diff --git a/.DS_Store b/.DS_Store index c8bedc41..7ecc854b 100644 Binary files a/.DS_Store and b/.DS_Store differ diff --git a/.github/workflows/python-package-conda.yml b/.github/workflows/python-package-conda.yml new file mode 100644 index 00000000..f3586044 --- /dev/null +++ b/.github/workflows/python-package-conda.yml @@ -0,0 +1,34 @@ +name: Python Package using Conda + +on: [push] + +jobs: + build-linux: + runs-on: ubuntu-latest + strategy: + max-parallel: 5 + + steps: + - uses: actions/checkout@v4 + - name: Set up Python 3.10 + uses: actions/setup-python@v3 + with: + python-version: '3.10' + - name: Add conda to system path + run: | + # $CONDA is an environment variable pointing to the root of the miniconda directory + echo $CONDA/bin >> $GITHUB_PATH + - name: Install dependencies + run: | + conda env update --file environment.yml --name base + - name: Lint with flake8 + run: | + conda install flake8 + # stop the build if there are Python syntax errors or undefined names + flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics + # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide + flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics + - name: Test with pytest + run: | + conda install pytest + pytest diff --git a/README.md b/README.md index eccda7f2..2b940efb 100644 --- a/README.md +++ b/README.md @@ -1,7 +1,19 @@ # TorchSpatial: A Location Encoding Framework and Benchmark for Spatial Representation Learning 🚧 Constructing... -![TorchSpatial Overall Framework](figs/TorchSpatial_task4.jpg) +## Overview +TorchSpatial offers a comprehensive framework and benchmark suite designed to advance spatial representation learning (SRL). It supports the development and evaluation of location encoders using extensive benchmarks and innovative evaluation metrics. + +## Features +- **Unified Framework**: Integrates 15 recognized location encoders to enhance scalability and reproducibility. +- **LocBench Benchmark**: Includes 17 datasets for geo-aware image classification and regression, enabling thorough performance assessments across various geographic distributions. +- **Geo-Bias Score**: A novel metric to evaluate model performance and geographic bias, promoting spatial fairness in GeoAI applications. + +## Availability +Access the TorchSpatial framework, LocBench benchmarks, and evaluation metrics on GitHub: [TorchSpatial GitHub Repository](https://github.com/seai-lab/TorchSpatial). + +## Overall Framework +![TorchSpatial Overall Framework](figs/TorchSpatial4_10regTasks.jpg) @@ -10,9 +22,11 @@ ## Data Download Instructions The data can be downloaded from the following DOI link: -[Download Data](https://doi.org/10.6084/m9.figshare.26026798) - -Data should be organized following the .. +[Download Data](https://doi.org/10.6084/m9.figshare.26026798 + + + + ) ## Code Execution The example bash files for running the codes can be found in main/run_bash folder @@ -68,8 +82,15 @@ All our experiments were conducted on a Ubuntu workstation equipped with 4 NVIDI ### Reference -If you find our work useful in your research please consider citing [our ISPRS PHOTO 2023 paper](https://www.researchgate.net/publication/371964548_Sphere2Vec_A_General-Purpose_Location_Representation_Learning_over_a_Spherical_Surface_for_Large-Scale_Geospatial_Predictions). 
+If you find our work useful in your research please consider citing our TorchSpatial paper and [ISPRS PHOTO 2023 paper](https://www.researchgate.net/publication/371964548_Sphere2Vec_A_General-Purpose_Location_Representation_Learning_over_a_Spherical_Surface_for_Large-Scale_Geospatial_Predictions). ``` +@article{wu2024torchspatial, + title={TorchSpatial: A Location Encoding Framework and Benchmark for Spatial Representation Learning}, + author={Wu, Nemin and Cao, Qian and Wang, Zhangyu and Liu, Zeping and Qi, Yanlin and Zhang, Jielu and Ni, Joshua and Yao, Xiaobai and Ma, Hongxu and Mu, Lan and Ermon, Stefano and Ganu, Tanuja and Nambi, Akshay and Lao, Ni and Mai, Gengchen}, + journal={arXiv preprint arXiv:2406.15658}, + year={2024} +} + @article{mai2023sphere2vec, title={Sphere2Vec: A General-Purpose Location Representation Learning over a Spherical Surface for Large-Scale Geospatial Predictions}, author={Mai, Gengchen and Xuan, Yao and Zuo, Wenyun and He, Yutong and Song, Jiaming and Ermon, Stefano and Janowicz, Krzysztof and Lao, Ni}, @@ -111,3 +132,6 @@ If you use the unsupervised learning function, please also cite [our ICML 2023 p organization={PMLR} } ``` + +### License +Our code is under MIT license. All data products created through our work that are not covered under upstream licensing agreements are available via a CC BY-NC 4.0 license. All upstream data use restrictions take precedence over this license. diff --git a/documentation/Encoders/NeRF.md b/documentation/Encoders/NeRF.md deleted file mode 100644 index 1a130be9..00000000 --- a/documentation/Encoders/NeRF.md +++ /dev/null @@ -1,91 +0,0 @@ -# NERFSpatialRelationLocationEncoder - -## Overview -The `NERFSpatialRelationLocationEncoder` is designed to compute spatial embeddings from coordinate data using a Neural Radiance Field (NeRF) based encoding approach. This encoder integrates a position encoding strategy, leveraging a [`NERFSpatialRelationPositionEncoder`](#NERFSpatialRelationPositionEncoder), and further processes the encoded positions through a customizable multi-layer feed-forward neural network. - -## Features -- **Position Encoding (`self.position_encoder`)**: Utilizes the [`NERFSpatialRelationPositionEncoder`](#NERFSpatialRelationPositionEncoder) to encode spatial differences (latitude, longitude) using NeRF-inspired sinusoidal functions. -- **Feed-Forward Neural Network (`self.ffn`)**: Transforms the position-encoded data through a series of feed-forward layers to produce high-dimensional spatial embeddings. - -## Configuration Parameters -- `spa_embed_dim`: Dimensionality of the spatial embedding output. -- `coord_dim`: Dimensionality of the coordinate space (e.g., 2D, 3D). -- `device`: Computation device (e.g., 'cuda' for GPU). -- `frequency_num`: Number of frequency components used in positional encoding. -- `freq_init`: Initial setting for frequency calculation, set to 'nerf' for NeRF-specific frequency calculations. -- `ffn_act`: Activation function for the feed-forward layers. -- `ffn_num_hidden_layers`: Number of hidden layers in the feed-forward network. -- `ffn_dropout_rate`: Dropout rate used in the feed-forward network. -- `ffn_hidden_dim`: Dimension of each hidden layer in the feed-forward network. -- `ffn_use_layernormalize`: Flag to enable layer normalization in the feed-forward network. -- `ffn_skip_connection`: Flag to enable skip connections in the feed-forward network. -- `ffn_context_str`: Context string for debugging and detailed logging within the network. 
- -## Methods -### `forward(coords)` -- **Purpose**: Processes input coordinates through the location encoder to produce final spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to process, expected to be in the form `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): Spatial relation embeddings with a shape of `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## NERFSpatialRelationPositionEncoder - -### Features -

-*(figure: NeRF-transformation)*

- -### Configuration Parameters -- **coord_dim**: Dimensionality of the space being encoded (e.g., 2D, 3D). -- **frequency_num**: Number of different sinusoidal frequencies used to encode spatial differences. -- **freq_init**: Frequency initialization method, set to 'nerf' for NeRF-based encoding. -- **device**: Specifies the computational device, e.g., 'cuda' for GPU acceleration. - -### Methods - -#### `cal_freq_list()` -Calculates the list of frequencies used for the sinusoidal encoding based on the NeRF methodology, using an exponential scaling of frequencies. -- **Modifies**: - - Internal frequency list based on the specified initialization method. - -#### `cal_freq_mat()` -Creates a frequency matrix to be used in the encoding process. -- **Modifies**: - - Internal frequency matrix to match the dimensions required for vectorized operations. - -#### `make_output_embeds(coords)` -Processes a batch of coordinates and converts them into spatial relation embeddings. -- **Parameters**: - - `coords`: Batch of geographic coordinates. -- **Returns**: - - Batch of spatial relation embeddings in high-dimensional space. - -### Implementation Details -- Converts longitude and latitude to radians, then to Cartesian coordinates assuming a unit sphere. -- Applies sinusoidal functions to these Cartesian coordinates, scaled by the computed frequencies. -- Outputs high-dimensional embeddings based on these sinusoidally encoded coordinates. - -## Usage Example -```python -# Initialize the encoder -encoder = NERFSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - device="cuda", - frequency_num=16, - freq_init="nerf", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="NERFSpatialRelationEncoder" -) - -# Sample coordinates -coords = np.array([[34.0522, -118.2437],..., [40.7128, -74.0060]]) # Example: [latitude, longitude] - -# Generate spatial embeddings -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/Space2Vec-grid.md b/documentation/Encoders/Space2Vec-grid.md deleted file mode 100644 index 962db8f8..00000000 --- a/documentation/Encoders/Space2Vec-grid.md +++ /dev/null @@ -1,106 +0,0 @@ -# Space2Vec-grid (GridCellSpatialRelationLocationEncoder) - -## Overview -The `GridCellSpatialRelationLocationEncoder` is designed for encoding spatial relations between locations. This encoder integrates a position encoding strategy, leveraging a `GridCellSpatialRelationPositionEncoder`, and further processes the encoded positions through a customizable multi-layer feed-forward neural network. - -## Features -- **Position Encoding (`self.position_encoder`)**: Utilizes the [`GridCellSpatialRelationPositionEncoder`](#GridCellSpatialRelationPositionEncoder) to encode spatial differences (deltaX, deltaY) based on sinusoidal functions. -- **Feed-Forward Neural Network (`self.ffn`)**: Transforms the position-encoded data through a series of feed-forward layers to produce high-dimensional spatial embeddings. - -## Configuration Parameters -- `spa_embed_dim`: Dimensionality of the spatial embedding output. -- `coord_dim`: Dimensionality of the coordinate space (e.g., 2D, 3D). -- `frequency_num`: Number of different frequencies used for the sinusoidal encoding. -- `max_radius`: The maximum context radius the model can handle. -- `min_radius`: The minimum context radius, important for defining the scale of positional encoding. 
-- `freq_init`: Method of initializing the frequency list ('random', 'geometric', 'nerf'). -- `device`: Computation device (e.g., 'cuda' for GPU). -- `ffn_act`: Activation function for the feed-forward layers. -- `ffn_num_hidden_layers`: Number of hidden layers in the feed-forward network. -- `ffn_dropout_rate`: Dropout rate used in the feed-forward network. -- `ffn_hidden_dim`: Dimension of each hidden layer in the feed-forward network. -- `ffn_use_layernormalize`: Flag to enable layer normalization in the feed-forward network. -- `ffn_skip_connection`: Flag to enable skip connections in the feed-forward network. -- `ffn_context_str`: Context string for debugging and detailed logging within the network. - -## Methods -### `forward(coords)` -- **Purpose**: Processes input coordinates through the location encoder to produce final spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to process, expected to be in the form `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): Spatial relation embeddings with a shape of `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## GridCellSpatialRelationPositionEncoder - -### Features -- **Sinusoidal Encoding**: Utilizes sinusoidal functions to encode spatial differences, allowing for the representation of these differences in a form that neural networks can more effectively learn from. - -### Configuration Parameters -- **coord_dim**: Dimensionality of the space being encoded (e.g., 2D, 3D). -- **frequency_num**: Number of different sinusoidal frequencies used to encode spatial differences. -- **max_radius**: Maximum spatial context radius, defining the upper scale of encoding. -- **min_radius**: Minimum spatial context radius, defining the lower scale of encoding. -- **freq_init**: Method to initialize the frequency list, can be 'random', 'geometric', or 'nerf'. -- **device**: Specifies the computational device, e.g., 'cuda' for GPU acceleration. - -### Methods - -#### `cal_elementwise_angle(coord, cur_freq)` -Calculates the angle for sinusoidal function based on the coordinate difference and current frequency. -- **Parameters**: - - `coord`: Spatial difference (deltaX or deltaY). - - `cur_freq`: Current frequency index. -- **Returns**: - - Calculated angle for sinusoidal transformation. - -#### `cal_coord_embed(coords_tuple)` -Converts a tuple of coordinates into an embedded format using sinusoidal encoding. -- **Parameters**: - - `coords_tuple`: Tuple containing deltaX and deltaY. -- **Returns**: - - High-dimensional vector representing the embedded coordinates. - -#### `cal_pos_enc_output_dim()` -Calculates the output dimension of the position-encoded spatial relationship. -- **Returns**: - - The dimension of the encoded spatial relation embedding. - -#### `cal_freq_list()` -Calculates the list of frequencies used for the sinusoidal encoding based on the initialization method specified. -- **Modifies**: - - Internal frequency list based on the maximum and minimum radii and the total number of frequencies. - -#### `cal_freq_mat()` -Generates a matrix of frequencies to be used for batch processing of spatial data. -- **Modifies**: - - Internal frequency matrix to match the dimensions required for vectorized operations. - -#### `make_output_embeds(coords)` -Processes a batch of coordinates and converts them into spatial relation embeddings. -- **Parameters**: - - `coords`: Batch of spatial differences. -- **Returns**: - - Batch of spatial relation embeddings in high-dimensional space. 
-> -## Usage Example -```python -encoder = GridCellSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - frequency_num=16, - max_radius=10000, - min_radius=10, - freq_init="geometric", - device="cuda", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="GridCellSpatialRelationEncoder" -) - -coords = np.array([...]) # your coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/Space2Vec-theory.md b/documentation/Encoders/Space2Vec-theory.md deleted file mode 100644 index d47ac979..00000000 --- a/documentation/Encoders/Space2Vec-theory.md +++ /dev/null @@ -1,101 +0,0 @@ -# Space2Vec-theory - TheoryGridCellSpatialRelationLocationEncoder - -## Overview -The `TheoryGridCellSpatialRelationLocationEncoder` extends the `LocationEncoder` to encode spatial relationships between locations using advanced theoretical methods. This encoder uses a specialized position encoder (`TheoryGridCellSpatialRelationPositionEncoder`) to transform spatial differences into a high-dimensional space, and further processes these embeddings through a custom multi-layer feed-forward neural network. - -## Features -- **Position Encoding**: Utilizes the [`TheoryGridCellSpatialRelationPositionEncoder`](#TheoryGridCellSpatialRelationPositionEncoder) for converting spatial differences into encoded positions based on specified frequencies and radii. -- **Feed-Forward Neural Network**: Processes the position-encoded data through a multi-layer neural network, customizable in terms of architecture and activation functions. - -## Configuration Parameters -- `spa_embed_dim`: The dimensionality of the output spatial embeddings. -- `coord_dim`: Dimensionality of the coordinate space (e.g., 2D). -- `frequency_num`: The number of different frequencies used for sinusoidal encoding. -- `max_radius`: The largest context radius the model can handle. -- `min_radius`: The smallest context radius, essential for defining the scale of positional encoding. -- `freq_init`: Method for initializing the frequency list ('geometric' by default). -- `device`: Computation device (e.g., 'cuda' for GPU operations). -- `ffn_act`: Activation function used in the feed-forward network. -- `ffn_num_hidden_layers`: Number of hidden layers in the feed-forward network. -- `ffn_dropout_rate`: Dropout rate used in the network. -- `ffn_hidden_dim`: Dimension of each hidden layer in the network. -- `ffn_use_layernormalize`: Boolean flag to enable layer normalization in the network. -- `ffn_skip_connection`: Boolean flag to enable skip connections in the network. -- `ffn_context_str`: A string identifier used for context-specific logging or debugging. - -> ## TheoryGridCellSpatialRelationPositionEncoder -### Key Features and Enhancements: - -- **Multi-Angular Encoding:** Utilizes three unit vectors positioned at 0, 120, and 240 degrees to capture directional nuances in spatial relationships. -- **Expanded Frequency Matrix:** Extends the frequency matrix to accommodate the encoding across these three directions, effectively tripling the dimensionality involved in the encoding process. -- **Sinusoidal Encoding Across Angles:** Applies sinusoidal functions to the dot products of spatial differences and unit vectors, scaled by the frequencies to produce embeddings that capture both magnitude and directional information. 
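To make the multi-angular scheme above concrete, here is a minimal NumPy sketch of projecting spatial differences onto three unit vectors at 0°, 120°, and 240° and applying sine/cosine at geometrically spaced scales. It only illustrates the idea; the function name, the geometric wavelength schedule, and the feature ordering are assumptions rather than the repository's `TheoryGridCellSpatialRelationPositionEncoder` code.

```python
import numpy as np

def multi_angular_encode(delta_xy, frequency_num=16, min_radius=10.0, max_radius=10000.0):
    """Illustrative multi-angular sinusoidal encoding of (deltaX, deltaY) pairs.

    delta_xy: (N, 2) array of spatial differences; returns an (N, 6 * frequency_num) array.
    """
    # Three unit vectors at 0, 120, and 240 degrees.
    angles = np.deg2rad([0.0, 120.0, 240.0])
    unit_vecs = np.stack([np.cos(angles), np.sin(angles)], axis=1)      # (3, 2)

    # Geometrically spaced wavelengths between min_radius and max_radius (assumed schedule).
    wavelengths = np.geomspace(min_radius, max_radius, frequency_num)   # (F,)

    proj = delta_xy @ unit_vecs.T                                       # (N, 3) dot products
    scaled = proj[:, :, None] / wavelengths[None, None, :]              # (N, 3, F)
    # Sine and cosine of each scaled projection -> 3 directions * F frequencies * 2 = 6F features.
    return np.concatenate([np.sin(scaled), np.cos(scaled)], axis=-1).reshape(len(delta_xy), -1)

deltas = np.array([[1200.0, -350.0], [15.0, 42.0]])
print(multi_angular_encode(deltas).shape)  # (2, 96) with frequency_num=16
```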
- -### Function Descriptions - -#### `__init__(...)` - -Initializes the encoder with configurable parameters for spatial dimensions, frequency of encoding, and operational devices. - -##### Parameters: - -- `coord_dim`: Dimensionality of the space being encoded (e.g., 2D, 3D). -- `frequency_num`: Number of different sinusoidal frequencies used. -- `max_radius`: Largest context radius the model can handle. -- `min_radius`: Smallest context radius, defining the lower scale of encoding. -- `freq_init`: Method for initializing the frequency list (e.g., 'geometric'). -- `device`: Computational device (e.g., 'cuda'). - -#### `cal_freq_mat()` - -Adjusts the frequency matrix to match the expanded encoding scheme, allowing for a six-dimensional representation per frequency due to the triple angular approach. - -#### `cal_pos_enc_output_dim()` - -Calculates the output dimension of the position-encoded spatial relation embedding, considering the expanded dimensionality due to multiple angles. - -#### `make_output_embeds(coords)` - -Processes a batch of coordinates, computes the dot products with unit vectors, applies sinusoidal encoding, and integrates these across specified frequencies. - -##### Parameters: - -- `coords`: Batch of spatial differences (deltaX, deltaY). - -##### Returns: - -Spatial relation embeddings in a high-dimensional space. - -#### `forward(coords)` - -Feeds the processed coordinates through the encoder to produce final spatial embeddings suitable for further processing or model input. - -##### Parameters: - -- `coords`: Coordinates to process. - -##### Returns: - -Tensor of spatial relation embeddings. - - -## Usage Example -```python -encoder = TheoryGridCellSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - frequency_num=16, - max_radius=10000, - min_radius=10, - freq_init="geometric", - device="cuda", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="TheoryGridCellSpatialRelationEncoder" -) - -coords = np.array([...]) # your coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/Sphere2Vec-dfs.md b/documentation/Encoders/Sphere2Vec-dfs.md deleted file mode 100644 index 0f40c16b..00000000 --- a/documentation/Encoders/Sphere2Vec-dfs.md +++ /dev/null @@ -1,98 +0,0 @@ -# DFTSpatialRelationLocationEncoder Documentation - -## Overview -The `DFTSpatialRelationLocationEncoder` is designed to process spatial relations between locations using Discrete Fourier Transform (DFT) principles adapted for spatial encoding. It utilizes the `DFTSpatialRelationPositionEncoder` to transform spatial coordinates into a frequency domain, enhancing the model's ability to capture and interpret spatial relationships across various scales. - -## Features -- **Position Encoding (`self.position_encoder`)**: Utilizes `DFTSpatialRelationPositionEncoder` for transforming spatial differences into frequency-based representations. -- **Feed-Forward Neural Network (`self.ffn`)**: Processes the frequency domain data through a multi-layer neural network to generate final spatial embeddings. - -## Configuration Parameters -- `spa_embed_dim`: Dimensionality of the output spatial embeddings. -- `coord_dim`: Dimensionality of the coordinate space (typically 2D). -- `frequency_num`: Number of frequency components used in the positional encoding. -- `max_radius`: Maximum distance considered for spatial interactions. 
-- `min_radius`: Minimum distance that can be resolved by the encoding. -- `freq_init`: Method used for initializing the frequency components ('geometric' suggests a regular scaling). -- `device`: Computation device used (e.g., 'cuda' for GPU acceleration). -- `ffn_act`: Activation function for the neural network layers. -- `ffn_num_hidden_layers`: Number of hidden layers in the neural network. -- `ffn_dropout_rate`: Dropout rate to prevent overfitting during training. -- `ffn_hidden_dim`: Dimension of each hidden layer within the network. -- `ffn_use_layernormalize`: Whether to use layer normalization. -- `ffn_skip_connection`: Whether to include skip connections within the network layers. -- `ffn_context_str`: Context string for debugging and detailed logging within the network. - -## Methods -### `forward(coords)` -- **Purpose**: Processes input coordinates through the encoder to produce spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to process, formatted as `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): The final spatial relation embeddings, shaped `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## DFTSpatialRelationPositionEncoder - -### Overview -This position encoder leverages Discrete Fourier Transform (DFT) techniques to encode spatial coordinates into the frequency domain, enabling the model to recognize patterns and relationships that are not immediately apparent in the spatial domain. -

-*(figure: dfs-transformation)*

-### Features -- **Frequency Domain Conversion**: Transforms spatial data into a frequency-based representation, capturing inherent spatial frequencies and patterns effectively. -- **Multi-Scale Analysis**: By varying the number of frequencies and their initialization, the encoder can adapt to different spatial scales and resolutions. - -### Configuration Parameters -- `coord_dim`: Dimensionality of the space being encoded. -- `frequency_num`: Number of different frequencies used in the encoding. -- `max_radius`: The maximum effective radius for the encoding, influencing the lowest frequency. -- `min_radius`: The minimum effective radius, influencing the highest frequency. -- `freq_init`: The method for initializing the frequencies, impacting how spatial scales are represented. -- `device`: Specifies the computation device. - -### Methods - -#### `cal_elementwise_angle(coord, cur_freq)` -- **Description**: Calculates the angle for each frequency based on the spatial coordinate. -- **Parameters**: - - `coord`: Spatial difference, either deltaX or deltaY. - - `cur_freq`: Current frequency index. -- **Returns**: - - Computed angle for the transformation. - -#### `cal_coord_embed(coords_tuple)` -- **Description**: Encodes a set of coordinates into their frequency domain representations. -- **Parameters**: - - `coords_tuple`: A tuple of spatial differences. -- **Returns**: - - High-dimensional vector representing the frequency domain embeddings. - -#### `make_output_embeds(coords)` -- **Description**: Converts input spatial data into a comprehensive set of frequency domain features. -- **Parameters**: - - `coords`: Spatial coordinates to encode. -- **Returns**: - - High-dimensional embeddings that represent the input data in the frequency domain. - -## Usage Example -```python -# Initialize the encoder -encoder = DFTSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - frequency_num=16, - max_radius=10000, - min_radius=10, - freq_init="geometric", - device="cuda", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="DFTSpatialRelationEncoder" -) - -coords = np.array([[34.0522, -118.2437], [40.7128, -74.0060]]) # Example coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/Sphere2Vec-sphereC+.md b/documentation/Encoders/Sphere2Vec-sphereC+.md deleted file mode 100644 index c7e0cf5a..00000000 --- a/documentation/Encoders/Sphere2Vec-sphereC+.md +++ /dev/null @@ -1,82 +0,0 @@ -# SphereGridSpatialRelationLocationEncoder Documentation - -## Overview -The `SphereGridSpatialRelationLocationEncoder` is engineered for encoding spatial relationships between locations. It leverages the `SphereGridSpatialRelationPositionEncoder` to initially encode spatial differences, then processes these through a customizable multi-layer feed-forward neural network to produce high-dimensional spatial embeddings. - -## Features -- **Position Encoding**: Uses the `SphereGridSpatialRelationPositionEncoder` for encoding spatial differences using sinusoidal functions. -- **Feed-Forward Neural Network**: Converts the position-encoded data into spatial embeddings through multiple neural network layers. - -## Configuration Parameters -- **spa_embed_dim**: The dimensionality of the spatial embedding output. -- **coord_dim**: The dimensionality of the coordinate space (e.g., 2D, 3D). -- **device**: Computation device (e.g., 'cuda'). 
-- **frequency_num**: Number of frequency components used in positional encoding. -- **max_radius**: Maximum spatial context radius. -- **min_radius**: Minimum spatial context radius. -- **freq_init**: Initialization method for frequency calculation, set to 'geometric'. -- **ffn_act**: Activation function for the feed-forward layers. -- **ffn_num_hidden_layers**: Number of hidden layers in the feed-forward network. -- **ffn_dropout_rate**: Dropout rate used in the feed-forward network. -- **ffn_hidden_dim**: Dimension of each hidden layer in the feed-forward network. -- **ffn_use_layernormalize**: Flag to enable layer normalization in the network. -- **ffn_skip_connection**: Flag to enable skip connections in the network. -- **ffn_context_str**: Context string for debugging and detailed logging. - -## Methods -### `forward(coords)` -- **Purpose**: Processes input coordinates through the encoder to produce final spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to be processed, expected in the format `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): Spatial relation embeddings, shaped `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## SphereGridSpatialRelationPositionEncoder - -### Features -- **Sinusoidal Encoding**: Applies sinusoidal functions to encode spatial differences, enhancing the model's ability to learn from these features. -- **Configurable Parameters**: Supports customization of encoding parameters such as space dimensionality and computation device. -

-*(figure: sphereC-plus-transformation)*

-### Configuration Parameters -- **coord_dim**: Dimensionality of the space being encoded (e.g., 2D, 3D). -- **frequency_num**: Number of frequencies used in sinusoidal encoding. -- **device**: Specifies the computational device. - -### Methods -#### `make_output_embeds(coords)` -- **Description**: Converts a batch of coordinates into spatial relation embeddings. -- **Parameters**: - - `coords`: Spatial differences to be encoded. -- **Returns**: - - Spatial relation embeddings in high-dimensional space. - -#### `forward(coords)` -- **Description**: Feeds processed coordinates through the encoder to generate final spatial embeddings. -- **Parameters**: - - `coords`: Coordinates to process. -- **Returns**: - - Tensor of spatial relation embeddings. - -## Usage Example -```python -encoder = SphereGridSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - device="cuda", - frequency_num=16, - max_radius=10000, - min_radius=10, - freq_init="geometric", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="SphereGridSpatialRelationEncoder" -) - -coords = np.array([[34.0522, -118.2437], [40.7128, -74.0060]]) # Example coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/Sphere2Vec-sphereC.md b/documentation/Encoders/Sphere2Vec-sphereC.md deleted file mode 100644 index ee7a7c22..00000000 --- a/documentation/Encoders/Sphere2Vec-sphereC.md +++ /dev/null @@ -1,121 +0,0 @@ -# SphereSpatialRelationLocationEncoder - -## Overview -The `SphereSpatialRelationLocationEncoder` is designed for encoding spatial relations between locations using a spherical coordinate system. This encoder integrates a position encoding strategy, leveraging a [`SphereSpatialRelationPositionEncoder`](#SphereSpatialRelationPositionEncoder), and further processes the encoded positions through a customizable multi-layer feed-forward neural network. - -## Features -- **Position Encoding (`self.position_encoder`)**: Utilizes the [`SphereSpatialRelationPositionEncoder`](#SphereSpatialRelationPositionEncoder) to encode spatial differences (deltaX, deltaY) using sinusoidal functions. -- **Feed-Forward Neural Network (`self.ffn`)**: Transforms the position-encoded data through a series of feed-forward layers to produce high-dimensional spatial embeddings. - -## Configuration Parameters -- `spa_embed_dim`: Dimensionality of the spatial embedding output. -- `coord_dim`: Dimensionality of the coordinate space (e.g., 2D, 3D). -- `frequency_num`: Number of different frequencies used for the sinusoidal encoding. -- `max_radius`: The maximum context radius the model can handle. -- `min_radius`: The minimum context radius, important for defining the scale of positional encoding. -- `freq_init`: Method of initializing the frequency list ('random', 'geometric', 'nerf'). -- `device`: Computation device (e.g., 'cuda' for GPU). -- `ffn_act`: Activation function for the feed-forward layers. -- `ffn_num_hidden_layers`: Number of hidden layers in the feed-forward network. -- `ffn_dropout_rate`: Dropout rate used in the feed-forward network. -- `ffn_hidden_dim`: Dimension of each hidden layer in the feed-forward network. -- `ffn_use_layernormalize`: Flag to enable layer normalization in the feed-forward network. -- `ffn_skip_connection`: Flag to enable skip connections in the feed-forward network. -- `ffn_context_str`: Context string for debugging and detailed logging within the network. 
- -## Methods -### `forward(coords)` -- **Purpose**: Processes input coordinates through the location encoder to produce final spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to process, expected to be in the form `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): Spatial relation embeddings with a shape of `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## SphereSpatialRelationPositionEncoder - -### Overview -

-*(figure: Sphere2Vec-sphereC-transformation)*

-#### Spherical Coordinate Transformation - -- The encoder first transforms geographical coordinates (longitude and latitude) from degrees to radians. -- These coordinates are then converted to Cartesian coordinates (x, y, z) on a unit sphere. - -#### Sinusoidal Encoding - -- The Cartesian coordinates are scaled using a set of predefined frequencies. -- Sinusoidal functions (sine and cosine) are applied to these scaled coordinates to produce the final embeddings. - -### Configuration Parameters -- **coord_dim**: Dimensionality of the space being encoded (e.g., 2D, 3D). -- **frequency_num**: Number of different sinusoidal frequencies used to encode spatial differences. -- **max_radius**: Maximum spatial context radius, defining the upper scale of encoding. -- **min_radius**: Minimum spatial context radius, defining the lower scale of encoding. -- **freq_init**: Method to initialize the frequency list, can be 'random', 'geometric', or 'nerf'. -- **device**: Specifies the computational device, e.g., 'cuda' for GPU acceleration. - -### Methods - -#### `cal_elementwise_angle(coord, cur_freq)` -Calculates the angle for sinusoidal function based on the coordinate difference and current frequency. -- **Parameters**: - - `coord`: Spatial difference (deltaX or deltaY). - - `cur_freq`: Current frequency index. -- **Returns**: - - Calculated angle for sinusoidal transformation. - -#### `cal_coord_embed(coords_tuple)` -Converts a tuple of coordinates into an embedded format using sinusoidal encoding. -- **Parameters**: - - `coords_tuple`: Tuple containing deltaX and deltaY. -- **Returns**: - - High-dimensional vector representing the embedded coordinates. - -#### `cal_pos_enc_output_dim()` -Calculates the output dimension of the position-encoded spatial relationship. -- **Returns**: - - The dimension of the encoded spatial relation embedding. - -#### `cal_freq_list()` -Calculates the list of frequencies used for the sinusoidal encoding based on the initialization method specified. -- **Modifies**: - - Internal frequency list based on the maximum and minimum radii and the total number of frequencies. - -#### `cal_freq_mat()` -Generates a matrix of frequencies to be used for batch processing of spatial data. -- **Modifies**: - - Internal frequency matrix to match the dimensions required for vectorized operations. - -#### `make_output_embeds(coords)` -Processes a batch of coordinates and converts them into spatial relation embeddings. -- **Parameters**: - - `coords`: Batch of spatial differences. -- **Returns**: - - Batch of spatial relation embeddings in high-dimensional space. 
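Before the usage example below, the two steps described above (degrees to radians to unit-sphere Cartesian coordinates, then multi-scale sine/cosine) can be sketched in a few lines of NumPy. This is an illustration of the described transformation, not the exact `SphereSpatialRelationPositionEncoder` implementation; the geometric frequency schedule and the (lon, lat) input ordering are assumptions.

```python
import numpy as np

def sphere_sinusoidal_encode(lonlat_deg, frequency_num=16):
    """Illustrative sketch: (lon, lat) in degrees -> unit-sphere (x, y, z) -> sin/cos features."""
    lon = np.deg2rad(lonlat_deg[:, 0])
    lat = np.deg2rad(lonlat_deg[:, 1])

    # Cartesian coordinates on the unit sphere.
    xyz = np.stack([np.cos(lat) * np.cos(lon),
                    np.cos(lat) * np.sin(lon),
                    np.sin(lat)], axis=1)                               # (N, 3)

    # Assumed geometric frequency schedule; the repository derives its schedule from
    # min_radius, max_radius, and freq_init.
    freqs = np.geomspace(1.0, 512.0, frequency_num)                     # (F,)

    scaled = xyz[:, :, None] * freqs[None, None, :]                     # (N, 3, F)
    return np.concatenate([np.sin(scaled), np.cos(scaled)], axis=-1).reshape(len(lonlat_deg), -1)

coords = np.array([[-118.2437, 34.0522], [-74.0060, 40.7128]])          # (lon, lat) pairs
print(sphere_sinusoidal_encode(coords).shape)                           # (2, 96)
```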
- -## Usage Example -```python -# Initialize the encoder -encoder = SphereSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - frequency_num=16, - max_radius=10000, - min_radius=10, - freq_init="geometric", - device="cuda", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="SphereSpatialRelationEncoder" -) - -# Sample coordinates -coords = np.array([[34.0522, -118.2437], [40.7128, -74.0060]]) # Example: [latitude, longitude] - -# Generate spatial embeddings -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/Sphere2Vec-sphereM+.md b/documentation/Encoders/Sphere2Vec-sphereM+.md deleted file mode 100644 index 00f1fbc2..00000000 --- a/documentation/Encoders/Sphere2Vec-sphereM+.md +++ /dev/null @@ -1,98 +0,0 @@ -# SphereGridMixScaleSpatialRelationLocationEncoder Documentation - -## Overview -The `SphereGridMixScaleSpatialRelationLocationEncoder` is engineered for advanced spatial encoding, integrating a position encoder that leverages geometrically scaled sinusoidal functions. It processes these encodings through a multi-layer feed-forward neural network to create detailed spatial embeddings. - -## Features -- **Position Encoding (`self.position_encoder`)**: Uses the `SphereGridMixScaleSpatialRelationPositionEncoder` to perform multi-scale sinusoidal encoding of spatial differences. -- **Feed-Forward Neural Network (`self.ffn`)**: Converts the position-encoded data into high-dimensional spatial embeddings through several neural network layers. - -## Configuration Parameters -- **spa_embed_dim**: The dimensionality of the spatial embeddings output. -- **coord_dim**: The dimensionality of the coordinate space. -- **frequency_num**: Number of frequency components used in positional encoding. -- **max_radius**: Maximum spatial context radius the encoder can handle. -- **min_radius**: Minimum radius for encoding, affecting the granularity of details captured. -- **freq_init**: Frequency initialization method, set to 'geometric'. -- **device**: Computation device, e.g., 'cuda'. -- **ffn_act**: Activation function used in the neural network layers. -- **ffn_num_hidden_layers**: Number of layers in the feed-forward network. -- **ffn_dropout_rate**: Dropout rate to prevent overfitting. -- **ffn_hidden_dim**: Dimension of each hidden layer in the network. -- **ffn_use_layernormalize**: Flag to enable layer normalization in the network. -- **ffn_skip_connection**: Flag to enable skip connections in the network. -- **ffn_context_str**: Context string for detailed logging and debugging within the network. - -## Methods -### `forward(coords)` -Processes input coordinates through the location encoder to produce detailed spatial embeddings. -- **Parameters**: - - **coords** (List or np.ndarray): Coordinates to process, formatted as `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - **sprenc** (Tensor): Spatial relation embeddings, shaped `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## SphereGridMixScaleSpatialRelationPositionEncoder - -### Overview -This position encoder transforms spatial coordinates using a sophisticated sinusoidal encoding method, featuring multiple scales to capture a wide range of spatial details. -

-*(figure: sphereM-plus-transformation)*

- -### Features -- **Multi-Scale Sinusoidal Encoding**: Applies sinusoidal functions at multiple scales to encode spatial differences, capturing a wide range of spatial details. -- **Geometric Frequency Scaling**: Frequencies increase geometrically, enhancing the encoder's ability to model spatial phenomena at various scales. -### Assumptions -- **Spatial Regularity**: Grid data often comes in regular, evenly spaced intervals, such as pixels in images or cells in raster GIS data. -- **Two-Dimensional Structure**: Most grid data is two-dimensional, requiring simultaneous encoding of both dimensions to capture spatial relationships effectively. - -### Configuration Parameters -- **coord_dim**: Dimensionality of the space being encoded. -- **frequency_num**: Total number of different sinusoidal frequencies used. -- **max_radius**: Largest spatial scale considered by the encoder. -- **min_radius**: Smallest spatial scale at which details are captured. -- **freq_init**: Method used to initialize the frequencies, typically 'geometric'. -- **device**: Computation device, such as 'cuda'. - -### Methods -#### `cal_elementwise_angle(coord, cur_freq)` -Calculates the angle for sinusoidal encoding based on the coordinate and the current frequency. -- **Parameters**: - - **coord**: Spatial difference, either deltaX or deltaY. - - **cur_freq**: Current frequency index. -- **Returns**: - - Computed angle for the sinusoidal transformation. - -#### `cal_coord_embed(coords_tuple)` -Converts a batch of coordinates into sinusoidally-encoded vectors. -- **Parameters**: - - **coords_tuple**: Tuple of deltaX and deltaY values. -- **Returns**: - - High-dimensional vector representing the encoded spatial relationships. - -#### `cal_output_dim()` -Calculates the dimensionality of the encoded spatial relation embeddings. -- **Returns**: - - Total dimensionality of the output spatial embeddings. - -## Usage Example -```python -encoder = SphereGridMixScaleSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - frequency_num=16, - max_radius=10000, - min_radius=10, - freq_init="geometric", - device="cuda", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="SphereGridMixScaleSpatialRelationEncoder" -) - -coords = np.array([[34.0522, -118.2437], [40.7128, -74.0060]]) # Example coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/Sphere2Vec-sphereM.md b/documentation/Encoders/Sphere2Vec-sphereM.md deleted file mode 100644 index 0bdf7fbb..00000000 --- a/documentation/Encoders/Sphere2Vec-sphereM.md +++ /dev/null @@ -1,125 +0,0 @@ -# SphereMixScaleSpatialRelationLocationEncoder Documentation - -## Overview -The `SphereMixScaleSpatialRelationLocationEncoder` is engineered to encode spatial relationships between locations using advanced position encoding techniques. It integrates the `SphereMixScaleSpatialRelationPositionEncoder` for initial encoding and processes the results through a multi-layer feed-forward neural network to produce high-dimensional spatial embeddings. - -## Features -- **Position Encoding (`self.position_encoder`)**: Utilizes the `SphereMixScaleSpatialRelationPositionEncoder` to encode spatial differences (deltaX, deltaY) using geometrically scaled sinusoidal functions. -- **Feed-Forward Neural Network (`self.ffn`)**: Transforms position-encoded data through several neural network layers to produce high-dimensional spatial embeddings. 
- -## Configuration Parameters -- **spa_embed_dim**: The dimensionality of the output spatial embeddings. -- **coord_dim**: The dimensionality of the coordinate space, typically 2D. -- **device**: Specifies the computation device, e.g., 'cuda'. -- **frequency_num**: Number of frequency components used in positional encoding. -- **max_radius**: The largest spatial context radius the model can handle. -- **min_radius**: The minimum radius, ensuring detailed capture at smaller scales. -- **freq_init**: Initialization method for frequency calculation, set to 'geometric'. -- **ffn_act**: Activation function used in the MLP layers. -- **ffn_num_hidden_layers**: Number of layers in the feed-forward network. -- **ffn_dropout_rate**: Dropout rate for regularization within the MLP. -- **ffn_hidden_dim**: Dimension of each hidden layer within the MLP. -- **ffn_use_layernormalize**: Boolean to enable normalization within the MLP. -- **ffn_skip_connection**: Enables skip connections within the MLP, potentially enhancing learning. -- **ffn_context_str**: Context string for debugging and detailed logging within the network. - -## Methods -### `forward(coords)` -Processes input coordinates through the location encoder to generate final spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to process, formatted as `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): Spatial relation embeddings with a shape of `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## SphereMixScaleSpatialRelationPositionEncoder - -### Overview -Transforms spatial coordinates into high-dimensional encoded formats using sinusoidal functions scaled across multiple frequencies, enhancing the model's capability to discern spatial nuances. - -### Assumptions for Grid-Structured Data - -#### Spatial Regularity -Grid data often comes in regular, evenly spaced intervals, such as pixels in images or cells in raster GIS data. - -#### Two-Dimensional Structure -Most grid data is two-dimensional, requiring simultaneous encoding of both dimensions to capture spatial relationships effectively. - -### Formula Development - -#### Base Sinusoidal Encoding -For each coordinate component $x$ and $y$, apply sinusoidal functions across multiple scales: - -$E(x, y) = \bigoplus{}^{L-1}_{i=0} \left\[ \sin(\omega_i x), \cos(\omega_i x), \sin(\omega_i y), \cos(\omega_i y) \right\]$ - -Where: -- $\bigoplus$ denotes vector concatenation. -- $L$ is the number of different frequencies used. -- $\omega_i$ are the scaled frequencies. - -#### Frequency Scaling -Given the grid structure, frequency scaling might be adapted based on typical distances or resolutions encountered in grid data: - -$\omega_i = \pi \cdot \left(\frac{2^i}{\text{cell size}}\right)$ - -This scaling method aligns the frequency increments with the spatial resolution of grid cells, allowing the encoder to capture variations within and between cells. - -#### Enhanced Spatial Encoding -To account for the two-dimensional nature of grid data and potentially the interactions between grid cells, the encoding can be expanded to include mixed terms that combine $x$ and $y$ coordinates: - -$E_{\text{enhanced}}(x, y) = E(x, y) \oplus \left\[\sin(\omega_i x) \cdot \cos(\omega_i y), \cos(\omega_i x) \cdot \sin(\omega_i y)\right\]$ - -These mixed terms help to model cross-dimensional spatial interactions, which are critical in grid-like structures where horizontal and vertical relationships might influence the spatial analysis. 
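The formula above translates almost directly into code. The sketch below transcribes E_enhanced(x, y) for a batch of points; `cell_size`, the default number of frequencies, and the function name are illustrative choices, not the repository's API.

```python
import numpy as np

def enhanced_grid_encode(xy, num_freqs=8, cell_size=1.0):
    """Transcribes the E_enhanced(x, y) formula above; output has 6 * num_freqs features per point."""
    omegas = np.pi * (2.0 ** np.arange(num_freqs)) / cell_size          # omega_i = pi * 2^i / cell_size
    x = xy[:, 0:1] * omegas                                             # (N, L)
    y = xy[:, 1:2] * omegas                                             # (N, L)

    base = np.concatenate([np.sin(x), np.cos(x), np.sin(y), np.cos(y)], axis=1)     # 4L base terms
    mixed = np.concatenate([np.sin(x) * np.cos(y), np.cos(x) * np.sin(y)], axis=1)  # 2L mixed terms
    return np.concatenate([base, mixed], axis=1)                                    # 6L features total

pts = np.array([[0.3, 0.7], [1.5, -2.0]])
print(enhanced_grid_encode(pts).shape)  # (2, 48) for num_freqs=8
```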
- -#### Output Dimensionality -The output dimensionality, considering the enhanced encoding, becomes: - -$\text{Output Dim} = 4L + 2L = 6L$ - -Where $4L$ comes from the original sinusoidal terms for $x$ and $y$, and $2L$ from the mixed terms added for cross-dimensional interactions. - -### Features -- **Geometric Frequency Scaling**: Employs a geometric progression of frequencies for sinusoidal encoding, capturing a broad range of spatial details. -- **Configurable Parameters**: Supports adjustments in encoding dimensions, frequency range, and computational resources. - -### Configuration Parameters -- **coord_dim**: The dimensionality of the space being encoded. -- **frequency_num**: The number of frequencies used for encoding. -- **device**: Specifies the computational device. - -### Methods -#### `cal_elementwise_angle(coord, cur_freq)` -Calculates the angle for sinusoidal encoding based on the coordinate and the current frequency. -- **Parameters**: - - `coord`: The deltaX or deltaY. - - `cur_freq`: The frequency index. -- **Returns**: - - The calculated angle for the sinusoidal transformation. - -#### `cal_coord_embed(coords_tuple)` -Converts a batch of coordinates into sinusoidally-encoded vectors. -- **Parameters**: - - `coords_tuple`: Tuple of spatial differences. -- **Returns**: - - High-dimensional vector representing the encoded spatial relationships. - -## Usage Example -```python -encoder = SphereMixScaleSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - device="cuda", - frequency_num=16, - max_radius=10000, - min_radius=10, - freq_init="geometric", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="SphereMixScaleSpatialRelationEncoder" -) - -coords = np.array([[34.0522, -118.2437], [40.7128, -74.0060]]) # Example coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/rbf.md b/documentation/Encoders/rbf.md deleted file mode 100644 index 51bd3376..00000000 --- a/documentation/Encoders/rbf.md +++ /dev/null @@ -1,140 +0,0 @@ -# RBFSpatialRelationLocationEncoder Documentation - -## Overview -The `RBFSpatialRelationLocationEncoder` is designed to process spatial relations between locations using Radial Basis Function (RBF) principles adapted for spatial encoding. It utilizes the `RBFSpatialRelationPositionEncoder` to transform spatial coordinates into a high-dimensional space, enhancing the model's ability to capture and interpret spatial relationships across various scales. - -## Features -- **Position Encoding (`self.position_encoder`)**: Utilizes `RBFSpatialRelationPositionEncoder` for transforming spatial differences into RBF-based representations. -- **Feed-Forward Neural Network (`self.ffn`)**: Processes the RBF-based data through a multi-layer neural network to generate final spatial embeddings. - -## Configuration Parameters -- `spa_embed_dim`: Dimensionality of the output spatial embeddings. -- `train_locs`: Training locations used to sample RBF anchor points. -- `model_type`: Type of the model, either 'global' or 'relative'. -- `coord_dim`: Dimensionality of the coordinate space (typically 2D). -- `device`: Computation device used (e.g., 'cuda' for GPU acceleration). -- `num_rbf_anchor_pts`: Number of RBF anchor points. -- `rbf_kernel_size`: Size of the RBF kernel. -- `rbf_kernel_size_ratio`: Ratio used to adjust the kernel size based on the distance from the origin (applied in relative models). 
-- `max_radius`: Maximum distance considered for spatial interactions. -- `rbf_anchor_pt_ids`: IDs of the RBF anchor points. -- `ffn_act`: Activation function for the neural network layers. -- `ffn_num_hidden_layers`: Number of hidden layers in the neural network. -- `ffn_dropout_rate`: Dropout rate to prevent overfitting during training. -- `ffn_hidden_dim`: Dimension of each hidden layer within the network. -- `ffn_use_layernormalize`: Whether to use layer normalization. -- `ffn_skip_connection`: Whether to include skip connections within the network layers. -- `ffn_context_str`: Context string for debugging and detailed logging within the network. - -## Methods -### `forward(coords)` -- **Purpose**: Processes input coordinates through the encoder to produce spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to process, formatted as `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): The final spatial relation embeddings, shaped `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## RBFSpatialRelationPositionEncoder - -### Overview -This position encoder leverages Radial Basis Function (RBF) techniques to encode spatial coordinates, enabling the model to recognize patterns and relationships that are not immediately apparent in the spatial domain. -### Theory - -#### Radial Basis Function (RBF) Encoding - -An RBF is a real-valued function whose value depends only on the distance from a center point, called an anchor point. -The RBF commonly used is the Gaussian function, which measures the similarity between data points based on their Euclidean distance. - -#### Gaussian RBF Kernel - -The Gaussian RBF kernel is defined as: - -$K(x, y) = \exp\left(-\frac{\|x - y\|^2}{2\sigma^2}\right)$ - -where $\|x - y\|$ is the Euclidean distance between points $x$ and $y$, and $\sigma$ is the kernel size (also called the bandwidth). - -#### Adaptive Kernel Sizes - -In some models, the kernel size $\sigma$ can vary based on the distance from the origin or another reference point. - -### Formulas - -#### Distance Calculation - -For each coordinate $(x, y)$ and each RBF anchor point $(a_i, b_i)$, the Euclidean distance is calculated as: - -$d_i = \sqrt{(x-a_i)^2 + (y-b_i)^2}$ - -#### Gaussian RBF Encoding - -The RBF encoding for each distance $d_i$ with kernel size $\sigma_i$ is calculated as: - -$\text{RBF}_i = \exp\left(-\frac{d_i^2}{2\sigma_i^2}\right)$ - -If a kernel size ratio is applied, $\sigma_i$ may be adjusted based on the distance from the origin: - -$\sigma_i = d_i \times \text{rbf-kernel-size-ratio} + \text{rbf-kernel-size}$ - -### Features -- **RBF Encoding**: Transforms spatial data into an RBF-based representation, capturing inherent spatial patterns effectively. -- **Adaptive Kernel Sizes**: Allows the kernel sizes to adapt based on the distance from the origin in relative models. - -### Configuration Parameters -- `model_type`: Type of the model, either 'global' or 'relative'. -- `train_locs`: Training locations used to sample RBF anchor points. -- `coord_dim`: Dimensionality of the space being encoded. -- `num_rbf_anchor_pts`: Number of different RBF anchor points used in the encoding. -- `rbf_kernel_size`: The RBF kernel size. -- `rbf_kernel_size_ratio`: Ratio used to adjust the kernel size based on the distance from the origin (applied in relative models). -- `max_radius`: The maximum effective radius for the encoding. -- `rbf_anchor_pt_ids`: IDs of the RBF anchor points. -- `device`: Specifies the computation device. 
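The distance and kernel formulas above map directly onto a few lines of NumPy. The sketch below is a minimal illustration of Gaussian RBF encoding against a set of anchor points, including the adaptive kernel-size rule used by the relative variant; the anchor sampling and function name are hypothetical, not the repository's `RBFSpatialRelationPositionEncoder` API.

```python
import numpy as np

def rbf_encode(coords, anchors, rbf_kernel_size=1.0, rbf_kernel_size_ratio=0.0):
    """Each point -> a vector of Gaussian RBF responses, one per anchor point."""
    # Pairwise Euclidean distances d_i between points (N, 2) and anchors (M, 2): shape (N, M).
    dists = np.linalg.norm(coords[:, None, :] - anchors[None, :, :], axis=-1)
    # Optionally let the kernel size grow with distance (the adaptive rule described above).
    sigma = dists * rbf_kernel_size_ratio + rbf_kernel_size
    return np.exp(-dists ** 2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
anchors = rng.uniform(-10.0, 10.0, size=(100, 2))   # stand-in for anchors sampled from train_locs
points = np.array([[0.0, 0.0], [3.0, -4.0]])
print(rbf_encode(points, anchors).shape)            # (2, 100)
```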
- -### Methods - -#### `cal_elementwise_angle(coord, cur_freq)` -- **Description**: Calculates the angle for each frequency based on the spatial coordinate. -- **Parameters**: - - `coord`: Spatial difference, either deltaX or deltaY. - - `cur_freq`: Current frequency index. -- **Returns**: - - Computed angle for the transformation. - -#### `cal_coord_embed(coords_tuple)` -- **Description**: Encodes a set of coordinates into their frequency domain representations. -- **Parameters**: - - `coords_tuple`: A tuple of spatial differences. -- **Returns**: - - High-dimensional vector representing the frequency domain embeddings. - -#### `make_output_embeds(coords)` -- **Description**: Converts input spatial data into a comprehensive set of frequency domain features. -- **Parameters**: - - `coords`: Spatial coordinates to encode. -- **Returns**: - - High-dimensional embeddings that represent the input data in the frequency domain. - -## Usage Example -```python -# Initialize the encoder -encoder = RBFSpatialRelationLocationEncoder( - spa_embed_dim=64, - train_locs=np.array([[34.0522, -118.2437], [40.7128, -74.0060]]), # Example train_locs - model_type='global', - coord_dim=2, - device="cuda", - num_rbf_anchor_pts=100, - rbf_kernel_size=10e2, - rbf_kernel_size_ratio=0.0, - max_radius=10000, - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="RBFSpatialRelationEncoder" -) - -coords = np.array([[34.0522, -118.2437], [40.7128, -74.0060]]) # Example coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/rff.md b/documentation/Encoders/rff.md deleted file mode 100644 index 1fcf714e..00000000 --- a/documentation/Encoders/rff.md +++ /dev/null @@ -1,108 +0,0 @@ -# RFFSpatialRelationLocationEncoder Documentation - -## Overview -The `RFFSpatialRelationLocationEncoder` is designed to process spatial relations between locations using Random Fourier Features (RFF) adapted for spatial encoding. It utilizes the `RFFSpatialRelationPositionEncoder` to transform spatial coordinates into a high-dimensional space, enhancing the model's ability to capture and interpret spatial relationships across various scales. - -## Features -- **Position Encoding (`self.position_encoder`)**: Utilizes `RFFSpatialRelationPositionEncoder` for transforming spatial differences into frequency-based representations using Random Fourier Features. -- **Feed-Forward Neural Network (`self.ffn`)**: Processes the RFF-based data through a multi-layer neural network to generate final spatial embeddings. - -## Configuration Parameters -- `spa_embed_dim`: Dimensionality of the output spatial embeddings. -- `coord_dim`: Dimensionality of the coordinate space (typically 2D). -- `frequency_num`: Number of frequency components used in the positional encoding. -- `rbf_kernel_size`: Size of the RBF kernel used in the generation of direction vectors. -- `extent`: The extent of the coordinate space (optional). -- `device`: Computation device used (e.g., 'cuda' for GPU acceleration). -- `ffn_act`: Activation function for the neural network layers. -- `ffn_num_hidden_layers`: Number of hidden layers in the neural network. -- `ffn_dropout_rate`: Dropout rate to prevent overfitting during training. -- `ffn_hidden_dim`: Dimension of each hidden layer within the network. -- `ffn_use_layernormalize`: Whether to use layer normalization. 
-- `ffn_skip_connection`: Whether to include skip connections within the network layers. -- `ffn_context_str`: Context string for debugging and detailed logging within the network. - -## Methods -### `forward(coords)` -- **Purpose**: Processes input coordinates through the encoder to produce spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to process, formatted as `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): The final spatial relation embeddings, shaped `(batch_size, num_context_pt, spa_embed_dim)`. - -## RFFSpatialRelationPositionEncoder - -### Overview -The `RFFSpatialRelationPositionEncoder` leverages Random Fourier Features (RFF) to encode spatial coordinates into high-dimensional representations. This method is based on the paper "Random Features for Large-Scale Kernel Machines" and is particularly effective for approximating kernel functions. - -### Features -- **Random Fourier Feature Encoding**: Transforms spatial data into a frequency-based representation, capturing inherent spatial frequencies and patterns effectively. -- **Adaptable to Different Spatial Extents**: Can normalize input coordinates based on the provided spatial extent. - -## Theory - -### Random Fourier Feature (RFF) Encoding - -Random Fourier Features provide an approximation to shift-invariant kernel functions by mapping the input data into a randomized low-dimensional feature space. The key idea is to use random projections to approximate the kernel function. - -### Gaussian RBF Kernel Approximation - -The Gaussian RBF kernel is defined as: -$K(x, y) = \exp\left(-\frac{\|x - y\|^2}{2\sigma^2}\right)$ -where $\|x - y\|$ is the Euclidean distance between points $x$ and $y$, and$\sigma$ is the kernel size (bandwidth). - -### Random Fourier Features - -Using Bochner's theorem, any shift-invariant kernel can be represented as the Fourier transform of a probability measure. For the Gaussian RBF kernel, the transformation is given by: -$z(x) = \sqrt{\frac{2}{D}} \cos(\omega^T x + b)$ -where$ \omega$ is drawn from a Gaussian distribution, $b$ is drawn from a uniform distribution, and $D$ is the dimension of the feature space. - -### Formulas - -1. **Generate Direction and Shift Vectors**: - - Direction vector $\omega$: - $\omega \sim \mathcal{N}(0, \sigma^2 I)$ - - Shift vector $b$: - - $b \sim \text{Uniform}(0, 2\pi)$ - -2. **Random Fourier Feature Transformation**: - - $z(x) = \sqrt{\frac{2}{D}} \cos(\omega^T x + b)$ - -### Implementation - -#### `generate_direction_vector()` -- **Purpose**: Generates the direction (omega) and shift (b) vectors used in the RFF transformation. -- **Returns**: - - `dirvec`: Direction vectors. - - `shift`: Shift vectors. - -#### `make_output_embeds(coords)` -- **Purpose**: Converts input coordinates into RFF-based high-dimensional embeddings. -- **Parameters**: - - `coords`: Input coordinates. -- **Returns**: - - High-dimensional embeddings representing the input data in the RFF feature space. 
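As a concrete illustration of the sampling and transformation formulas above, here is a minimal NumPy sketch. The function name and defaults are hypothetical rather than the repository's `generate_direction_vector`/`make_output_embeds` implementation, and the variance convention for omega follows the text above.

```python
import numpy as np

def rff_encode(coords, num_features=64, sigma=1.0, seed=42):
    """Random Fourier Features: z(x) = sqrt(2/D) * cos(omega^T x + b), per the formulas above."""
    rng = np.random.default_rng(seed)
    dim = coords.shape[1]
    omega = rng.normal(0.0, sigma, size=(dim, num_features))   # direction vectors (text's convention)
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)       # shift vector
    return np.sqrt(2.0 / num_features) * np.cos(coords @ omega + b)

pts = np.array([[34.0522, -118.2437], [40.7128, -74.0060]])
feats = rff_encode(pts)
print(feats.shape)  # (2, 64); dot products of rows approximate a Gaussian kernel between the points
```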
- -## Usage Example -```python -# Initialize the encoder -encoder = RFFSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - frequency_num=16, - rbf_kernel_size=1.0, - extent=None, - device="cuda", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="RFFSpatialRelationEncoder" -) - -coords = np.array([[34.0522, -118.2437], [40.7128, -74.0060]]) # Example coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/tile-ffn.md b/documentation/Encoders/tile-ffn.md deleted file mode 100644 index 35bbee12..00000000 --- a/documentation/Encoders/tile-ffn.md +++ /dev/null @@ -1,96 +0,0 @@ -# GridLookupSpatialRelationLocationEncoder - -## Overview -The `GridLookupSpatialRelationLocationEncoder` is designed to process spatial relations between locations using a grid-based lookup approach for spatial encoding. It utilizes the `GridLookupSpatialRelationPositionEncoder` to transform spatial coordinates into a high-dimensional space, enhancing the model's ability to capture and interpret spatial relationships across various scales. - -## Features -- **Position Encoding (`self.position_encoder`)**: Utilizes `GridLookupSpatialRelationPositionEncoder` for transforming spatial differences into grid-based representations. -- **Feed-Forward Neural Network (`self.ffn`)**: Processes the grid-based data through a multi-layer neural network to generate final spatial embeddings. - -## Configuration Parameters -- `spa_embed_dim`: Dimensionality of the output spatial embeddings. -- `extent`: The extent of the coordinate space (x_min, x_max, y_min, y_max). -- `interval`: The cell size in X and Y direction. -- `coord_dim`: Dimensionality of the coordinate space (typically 2D). -- `device`: Computation device used (e.g., 'cuda' for GPU acceleration). -- `ffn_act`: Activation function for the neural network layers. -- `ffn_num_hidden_layers`: Number of hidden layers in the neural network. -- `ffn_dropout_rate`: Dropout rate to prevent overfitting during training. -- `ffn_hidden_dim`: Dimension of each hidden layer within the network. -- `ffn_use_layernormalize`: Whether to use layer normalization. -- `ffn_skip_connection`: Whether to include skip connections within the network layers. -- `ffn_context_str`: Context string for debugging and detailed logging within the network. - -## Methods -### `forward(coords)` -- **Purpose**: Processes input coordinates through the encoder to produce spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to process, formatted as `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): The final spatial relation embeddings, shaped `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## GridLookupSpatialRelationPositionEncoder - -### Overview -The `GridLookupSpatialRelationPositionEncoder` divides the space into grids and assigns each point to the grid embedding it falls into. This method enables the model to use a grid-based representation for spatial encoding. - -### Features -- **Grid-Based Encoding**: Transforms spatial data into a grid-based representation, capturing spatial patterns effectively. -- **Adaptable to Different Spatial Extents**: Can normalize input coordinates based on the provided spatial extent. 
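The grid-based lookup just described can be sketched in a few lines of PyTorch: each coordinate is snapped to a grid cell and mapped to that cell's learnable embedding (the exact index formulas follow in the Theory section below). The class name, clamping behavior, and example values here are assumptions for illustration, not the repository's `GridLookupSpatialRelationPositionEncoder`.

```python
import numpy as np
import torch
import torch.nn as nn

class GridLookupSketch(nn.Module):
    """Minimal grid-lookup sketch: one learnable embedding per grid cell."""

    def __init__(self, spa_embed_dim, extent, interval):
        super().__init__()
        self.x_min, self.x_max, self.y_min, self.y_max = extent
        self.interval = interval
        self.num_cols = int(np.ceil((self.x_max - self.x_min) / interval))
        self.num_rows = int(np.ceil((self.y_max - self.y_min) / interval))
        self.embedding = nn.Embedding(self.num_cols * self.num_rows, spa_embed_dim)

    def forward(self, coords):
        # coords: (batch_size, num_context_pt, 2) ordered as (x, y)
        coords = torch.as_tensor(np.asarray(coords), dtype=torch.float32)
        col = ((coords[..., 0] - self.x_min) / self.interval).floor().long()
        row = ((coords[..., 1] - self.y_min) / self.interval).floor().long()
        col = col.clamp(0, self.num_cols - 1)   # keep out-of-extent points on the border cells
        row = row.clamp(0, self.num_rows - 1)
        return self.embedding(row * self.num_cols + col)

enc = GridLookupSketch(spa_embed_dim=64, extent=(-180, 180, -90, 90), interval=10)
print(enc([[[-118.24, 34.05], [-74.00, 40.71]]]).shape)  # torch.Size([1, 2, 64])
```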
- -## Theory - -### Grid-Based Encoding - -Grid-based encoding divides the spatial extent into equal-sized cells (grids) and assigns each coordinate to a specific grid cell. Each grid cell has an embedding that represents the spatial location. - -### Formulas - -1. **Grid Cell Calculation**: - - Calculate the column and row indices for each coordinate based on the grid interval and extent. - - $\text{col} = \left\lfloor \frac{x - x_{\text{min}}}{\text{interval}} \right\rfloor$ - - $\text{row} = \left\lfloor \frac{y - y_{\text{min}}}{\text{interval}} \right\rfloor$ - -2. **Grid Cell Index**: - - Calculate the unique index for each grid cell. -$\text{index} = \text{row} \times \text{num-cols} + \text{col}$ - -### Implementation - -#### `make_grid_embedding(interval, extent)` -- **Purpose**: Creates grid embeddings for the specified interval and extent. -- **Parameters**: - - `interval`: The cell size in X and Y direction. - - `extent`: The extent of the coordinate space. -- **Returns**: - - Grid embeddings for the specified interval and extent. - -#### `make_output_embeds(coords)` -- **Purpose**: Converts input coordinates into grid-based high-dimensional embeddings. -- **Parameters**: - - `coords`: Input coordinates. -- **Returns**: - - High-dimensional embeddings representing the input data in the grid-based feature space. - -## Usage Example -```python -# Initialize the encoder -encoder = GridLookupSpatialRelationLocationEncoder( - spa_embed_dim=64, - extent=(-180, 180, -90, 90), # Example extent - interval=1000000, # Example interval - coord_dim=2, - device="cuda", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="GridLookupSpatialRelationEncoder" -) - -coords = np.array([[34.0522, -118.2437], [40.7128, -74.0060]]) # Example coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/wrap-ffn.md b/documentation/Encoders/wrap-ffn.md deleted file mode 100644 index 17b86902..00000000 --- a/documentation/Encoders/wrap-ffn.md +++ /dev/null @@ -1,94 +0,0 @@ -# AodhaFFNSpatialRelationLocationEncoder - -## Overview -The `AodhaFFNSpatialRelationLocationEncoder` is designed to process spatial relations between locations using Fourier Feature Transform (FFT) adapted for spatial encoding. It utilizes the `AodhaFFTSpatialRelationPositionEncoder` to transform spatial coordinates into a high-dimensional space, enhancing the model's ability to capture and interpret spatial relationships across various scales. - -## Features -- **Position Encoding (`self.position_encoder`)**: Utilizes `AodhaFFTSpatialRelationPositionEncoder` for transforming spatial differences into frequency-based representations using Fourier Feature Transform. -- **Feed-Forward Neural Network (`self.ffn`)**: Processes the FFT-based data through a multi-layer neural network to generate final spatial embeddings. - -## Configuration Parameters -- `spa_embed_dim`: Dimensionality of the output spatial embeddings. -- `extent`: The extent of the coordinate space (x_min, x_max, y_min, y_max). -- `coord_dim`: Dimensionality of the coordinate space (typically 2D). -- `do_pos_enc`: Whether to perform position encoding. -- `do_global_pos_enc`: Whether to normalize coordinates globally. -- `device`: Computation device used (e.g., 'cuda' for GPU acceleration). -- `ffn_act`: Activation function for the neural network layers. -- `ffn_num_hidden_layers`: Number of hidden layers in the neural network. 
-- `ffn_dropout_rate`: Dropout rate to prevent overfitting during training. -- `ffn_hidden_dim`: Dimension of each hidden layer within the network. -- `ffn_use_layernormalize`: Whether to use layer normalization. -- `ffn_skip_connection`: Whether to include skip connections within the network layers. -- `ffn_context_str`: Context string for debugging and detailed logging within the network. - -## Methods -### `forward(coords)` -- **Purpose**: Processes input coordinates through the encoder to produce spatial embeddings. -- **Parameters**: - - `coords` (List or np.ndarray): Coordinates to process, formatted as `(batch_size, num_context_pt, coord_dim)`. -- **Returns**: - - `sprenc` (Tensor): The final spatial relation embeddings, shaped `(batch_size, num_context_pt, spa_embed_dim)`. - -## AodhaFFTSpatialRelationPositionEncoder - -### Overview -The `AodhaFFTSpatialRelationPositionEncoder` leverages Fourier Feature Transform (FFT) to encode spatial coordinates into high-dimensional representations. This method divides the space into grids and uses grid embeddings to represent the spatial relations. - -### Features -- **Fourier Feature Transform Encoding**: Transforms spatial data into a frequency-based representation, capturing inherent spatial frequencies and patterns effectively. -- **Adaptable to Different Spatial Extents**: Can normalize input coordinates based on the provided spatial extent. - -## Theory - -### Fourier Feature Transform (FFT) Encoding - -Fourier Feature Transform provides an approximation to continuous signals by mapping the input data into a randomized low-dimensional feature space using sine and cosine functions. - -### Fourier Transform Basis Functions - -The Fourier Transform uses sine and cosine functions as basis functions to represent the original signal. In spatial encoding, this can be represented as: -$\text{sin}(\pi \cdot x), \text{cos}(\pi \cdot x), \text{sin}(\pi \cdot y), \text{cos}(\pi \cdot y)$ - -### Formulas - -1. **Normalization**: - - Normalize coordinates based on the extent of the coordinate space. - - $x_{\text{norm}} = \frac{x - x_{\min}}{x_{\max} - x_{\min}}$ - - $y_{\text{norm}} = \frac{y - y_{\min}}{y_{\max} - y_{\min}}$ - -2. **Fourier Feature Transform**: - - Apply sine and cosine functions to the normalized coordinates: $\[\text{sin}(\pi \cdot x_{\text{norm}}), \text{cos}(\pi \cdot x_{\text{norm}}),$ $\text{sin}(\pi \cdot y_{\text{norm}}), \text{cos}(\pi \cdot y_{\text{norm}})\]$ - -### Implementation - -#### `make_output_embeds(coords)` -- **Purpose**: Converts input coordinates into FFT-based high-dimensional embeddings. -- **Parameters**: - - `coords`: Input coordinates. -- **Returns**: - - High-dimensional embeddings representing the input data in the FFT feature space. 
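For intuition, the normalization and sine/cosine mapping above can be sketched in NumPy as follows. The function name and argument conventions are assumptions for illustration, not the repository's `AodhaFFTSpatialRelationPositionEncoder`.

```python
import numpy as np

def aodha_position_encode(coords, extent):
    """Minimal sketch of the normalization + sin/cos mapping described above.

    coords: (batch_size, num_context_pt, 2) ordered as (x, y)
    extent: (x_min, x_max, y_min, y_max)
    """
    x_min, x_max, y_min, y_max = extent
    coords = np.asarray(coords, dtype=float)
    # Normalize each axis to the provided extent, as in the formulas above.
    x = (coords[..., 0] - x_min) / (x_max - x_min)
    y = (coords[..., 1] - y_min) / (y_max - y_min)
    # Stack [sin(pi*x), cos(pi*x), sin(pi*y), cos(pi*y)] on the last axis.
    return np.stack(
        [np.sin(np.pi * x), np.cos(np.pi * x), np.sin(np.pi * y), np.cos(np.pi * y)],
        axis=-1,
    )

feats = aodha_position_encode([[[-118.24, 34.05], [-74.00, 40.71]]], extent=(-180, 180, -90, 90))
print(feats.shape)  # (1, 2, 4)
```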
- -## Usage Example -```python -# Initialize the encoder -encoder = AodhaFFNSpatialRelationLocationEncoder( - spa_embed_dim=64, - extent=(0, 100, 0, 100), # Example extent - coord_dim=2, - do_pos_enc=True, - do_global_pos_enc=True, - device="cuda", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="AodhaFFTSpatialRelationEncoder" -) - -coords = np.array([[34.0522, -118.2437], [40.7128, -74.0060]]) # Example coordinate data -embeddings = encoder.forward(coords) diff --git a/documentation/Encoders/wrap.md b/documentation/Encoders/wrap.md deleted file mode 100644 index bd6c5cc7..00000000 --- a/documentation/Encoders/wrap.md +++ /dev/null @@ -1,45 +0,0 @@ -# FCNet Documentation - -## Overview -The `FCNet` class is a fully connected neural network designed for classification tasks. It consists of multiple residual layers to enhance feature extraction and can classify input data into multiple categories. - -## Features -- **Fully Connected Layers**: Utilizes linear layers to transform input data. -- **Residual Layers**: Includes residual layers to improve feature extraction. -- **Class Embedding**: Embeds the extracted features into the desired number of classes. -- **User Embedding**: Optionally embeds features into user-specific embeddings. - -## Configuration Parameters -- `num_inputs`: Dimensionality of the input embedding. -- `num_classes`: Number of categories for classification. -- `num_filts`: Dimensionality of the hidden embeddings. -- `num_users`: Number of user-specific embeddings (default is 1). - -## Methods -### `forward(x, class_of_interest=None, return_feats=False)` -- **Purpose**: Processes input features through the network and returns class predictions or intermediate feature embeddings. -- **Parameters**: - - `x` (torch.FloatTensor): Input location features with shape `(batch_size, input_loc_dim)`. - - `class_of_interest` (int, optional): Class ID for specific class evaluation. - - `return_feats` (bool, optional): Whether to return only the intermediate feature embeddings. -- **Returns**: - - If `return_feats` is True, returns the feature embeddings with shape `(batch_size, num_filts)`. - - If `class_of_interest` is specified, returns the sigmoid output for the specific class with shape `(batch_size)`. - - Otherwise, returns the sigmoid class predictions for all classes with shape `(batch_size, num_classes)`. - -### `eval_single_class(x, class_of_interest)` -- **Purpose**: Evaluates the network output for a specific class. -- **Parameters**: - - `x` (torch.FloatTensor): Feature embeddings with shape `(batch_size, num_filts)`. - - `class_of_interest` (int): Class ID for evaluation. -- **Returns**: - - Sigmoid output for the specified class with shape `(batch_size)`. - - - -### User Embedding -The user embedding layer maps the hidden features to user-specific embeddings: - -$\mathbf{y} _{\text{user}} = \mathbf{W} _{\text{user}} \mathbf{h}$ - -where $\mathbf{W}_{\text{user}}$ is the user weight matrix. diff --git a/documentation/Encoders/xyz.md b/documentation/Encoders/xyz.md deleted file mode 100644 index 21098008..00000000 --- a/documentation/Encoders/xyz.md +++ /dev/null @@ -1,80 +0,0 @@ -# Space2Vec-xyz (XYZSpatialRelationLocationEncoder) - -## Overview -The `XYZSpatialRelationLocationEncoder` is designed for encoding spatial relations between locations. 
This encoder integrates a position encoding strategy, leveraging an [XYZSpatialRelationPositionEncoder](#XYZSpatialRelationPositionEncoder), and further processes the encoded positions through a customizable multi-layer feed-forward neural network. - -## Features -- **Position Encoding (`self.position_encoder`):** Utilizes the [XYZSpatialRelationPositionEncoder](#XYZSpatialRelationPositionEncoder) to encode spatial differences (deltaX, deltaY) based on sinusoidal functions. -- **Feed-Forward Neural Network (`self.ffn`):** Transforms the position-encoded data through a series of feed-forward layers to produce high-dimensional spatial embeddings. - -## Configuration Parameters -- `spa_embed_dim`: Dimensionality of the spatial embedding output. -- `coord_dim`: Dimensionality of the coordinate space (e.g., 2D, 3D). -- `device`: Computation device (e.g., 'cuda' for GPU). -- `ffn_act`: Activation function for the feed-forward layers. -- `ffn_num_hidden_layers`: Number of hidden layers in the feed-forward network. -- `ffn_dropout_rate`: Dropout rate used in the feed-forward network. -- `ffn_hidden_dim`: Dimension of each hidden layer in the feed-forward network. -- `ffn_use_layernormalize`: Flag to enable layer normalization in the feed-forward network. -- `ffn_skip_connection`: Flag to enable skip connections in the feed-forward network. -- `ffn_context_str`: Context string for debugging and detailed logging within the network. - -## Methods -### `forward(coords)` -- **Purpose:** Processes input coordinates through the location encoder to produce final spatial embeddings. -- **Parameters:** - - `coords` (List or np.ndarray): Coordinates to process, expected to be in the form `(batch_size, num_context_pt, coord_dim)`. -- **Returns:** - - `sprenc` (Tensor): Spatial relation embeddings with a shape of `(batch_size, num_context_pt, spa_embed_dim)`. - -> ## XYZSpatialRelationPositionEncoder - -### Features -- **Sinusoidal Encoding:** Utilizes sinusoidal functions to encode spatial differences, allowing for the representation of these differences in a form that neural networks can more effectively learn from. -- **Configurable Parameters:** Allows customization of encoding parameters such as the dimensionality of space and computational device. - -### Configuration Parameters -- **coord_dim:** Dimensionality of the space being encoded (e.g., 2D, 3D). -- **device:** Specifies the computational device, e.g., 'cuda' for GPU acceleration. - -### Methods - -#### `make_output_embeds(coords)` -Processes a batch of coordinates and converts them into spatial relation embeddings. -- **Parameters:** - - `coords`: Batch of spatial differences. -- **Formulas:** - - Convert latitude `lat` and longitude `lon` coordinates into radians. - - Calculate `x, y, z` coordinates using the following equations: -

- *(figure: xyz-transformation)*
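The figure itself is not reproduced here; assuming the standard latitude/longitude to unit-sphere mapping (with $\phi$ = latitude and $\lambda$ = longitude in radians), the equations are:

$x = \cos(\phi)\cos(\lambda), \quad y = \cos(\phi)\sin(\lambda), \quad z = \sin(\phi)$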

- - Concatenate `x, y, z` coordinates to form the high-dimensional vector representation. - -- **Returns:** - - Batch of spatial relation embeddings in high-dimensional space. - -#### `forward(coords)` -Feeds the processed coordinates through the encoder to produce final spatial embeddings. -- **Parameters:** - - `coords`: Coordinates to process. -- **Returns:** - - Tensor of spatial relation embeddings. - -## Usage Example -```python -encoder = XYZSpatialRelationLocationEncoder( - spa_embed_dim=64, - coord_dim=2, - device="cuda", - ffn_act="relu", - ffn_num_hidden_layers=1, - ffn_dropout_rate=0.5, - ffn_hidden_dim=256, - ffn_use_layernormalize=True, - ffn_skip_connection=True, - ffn_context_str="XYZSpatialRelationEncoder" -) - -coords = np.array([...]) # your coordinate data -embeddings = encoder.forward(coords) -``` diff --git a/documentation/documentation.html deleted file mode 100644 index 992dc3f3..00000000 --- a/documentation/documentation.html +++ /dev/null @@ -1,758 +0,0 @@
diff --git a/documentation/documentation.md deleted file mode 100644 index c3fe45de..00000000 --- a/documentation/documentation.md +++ /dev/null @@ -1,227 +0,0 @@ ---- -output: - html_document: default - pdf_document: default ---- - -# 1. Single point location encoder -

- *(figure: Location Encoder Structure)*

- - -## 1.1 EncoderMultiLayerFeedForwardNN() -`NN(.) : ℝ^W -> ℝ^d` is a learnable neural network component which maps the input position embedding `PE(x) ∈ ℝ^W` into the location embedding `Enc(x) ∈ ℝ^d`. A common practice is to define `NN(.)` as a multi-layer perceptron, while Mac Aodha et al. (2019) adopted a more complex `NN(.)` which includes an initial fully connected layer, followed by a series of residual blocks. The purpose of `NN(.)` is to provide a learnable component for the location encoder, which captures the complex interaction between input locations and target labels. - -### 1.1.1 Properties - -- `input_dim` (int): Dimensionality of the input embeddings. -- `output_dim` (int): Dimensionality of the output of the network. -- `num_hidden_layers` (int): The number of hidden layers in the network. If set to 0, the network will be linear. -- `dropout_rate` (float, optional): The dropout rate for regularization. If None, dropout is not used. -- `hidden_dim` (int): The size of each hidden layer. Required if `num_hidden_layers` is greater than 0. -- `activation` (str): The type of activation function to use in the hidden layers. Common options are 'sigmoid', 'tanh', or 'relu'. -- `use_layernormalize` (bool): Determines whether to apply layer normalization after each hidden layer. -- `skip_connection` (bool): If set to True, enables skip connections between layers. -- `context_str` (str, optional): An optional string providing context for this instance, such as indicating its role within a larger model. - -### 1.1.3 Methods - -#### `__init__(input_dim, output_dim, num_hidden_layers=0, dropout_rate=None, hidden_dim=-1, activation="sigmoid", use_layernormalize=False, skip_connection=False, context_str=None)` -Constructor for the `EncoderMultiLayerFeedForwardNN` class. - -- **Parameters**: - - `input_dim` (int): Dimensionality of the input embeddings. - - `output_dim` (int): Dimensionality of the output of the network. - - `num_hidden_layers` (int): Number of hidden layers in the network, set to 0 for a linear network. - - `dropout_rate` (float, optional): Dropout keep probability. - - `hidden_dim` (int): Size of the hidden layers. - - `activation` (str): Activation function to use ('tanh' or 'relu'). - - `use_layernormalize` (bool): Whether to use layer normalization. - - `skip_connection` (bool): Whether to use skip connections. - - `context_str` (str, optional): Contextual string for the encoder. - -#### `forward(input_tensor)` -Defines the forward pass of the network. - -- **Parameters**: - - `input_tensor` (Tensor): A tensor with shape `[batch_size, ..., input_dim]`. -- **Returns**: A tensor with shape `[batch_size, ..., output_dim]`. Note that no non-linearity is applied to the output. - -- **Raises**: - - `AssertionError`: If the last dimension of `input_tensor` does not match `input_dim`. - - - -## 1.2 PositionEncoder() -`PE(.)` is the most important component which distinguishes different `Enc(x)`. Usually, `PE(.)` is a *deterministic* function which transforms location x into a W-dimension vector, so-called position embedding. The purpose of `PE(.)` is to do location feature normalization (Chu et al. 2019, Mac Aodha et al. 2019, Rao et al. 2020) and/or feature decomposition (Mai et al. 2020b, Zhong et al. 2020) so that the output `PE(x)` is more learning-friendly for `NN(.)`. In Table 1 we further classify different `Enc(x)` into four sub-categories based on their `PE(.)`: discretization-based, direct, sinusoidal, and sinusoidal multi-scale location encoder. 
Each of them will be discussed in detail below. - -### 1.2.1 Properties -- `spa_embed_dim` (int): The dimension of the output spatial relation embedding. -- `coord_dim` (int): The dimensionality of space (e.g., 2 for 2D, 3 for 3D). -- `frequency_num` (int): The number of different frequencies/wavelengths for the sinusoidal functions. -- `max_radius` (float): The largest context radius the model can handle. -- `min_radius` (float): The smallest context radius considered by the model. -- `freq_init` (str): Method to initialize the frequency list ('random' or 'geometric'). -- `ffn` (nn.Module, optional): A feedforward neural network module to be applied to the embeddings. -- `device` (str): The device to which tensors will be moved ('cuda' or 'cpu'). - - -### 1.2.2 Methods -### `get_activation_function(activation, context_str)` -- **Parameters**: - - `activation`: A string that specifies the type of activation function to retrieve. - - `context_str`: A string that provides context for the error message if the activation function is not recognized. -- **Returns**: An activation function object from the `torch.nn` module. -- **Description**: Retrieves an activation function object based on the specified `activation` string. It supports 'leakyrelu', 'relu', 'sigmoid', and 'tanh'. If the specified activation is not recognized, it raises an exception with a context-specific error message. -- **Exceptions**: Raises an `Exception` with the message `"{context_str} activation not recognized."` if the specified activation function is not one of the supported options. - - -#### `cal_freq_list(freq_init, frequency_num, max_radius, min_radius)` -- **Parameters**: - - `freq_init`: A string that specifies the initialization method for frequencies ('random' or 'geometric'). - - `frequency_num`: An integer representing the number of frequencies to generate. - - `max_radius`: A float representing the maximum radius, used as the upper bound for random initialization or the geometric sequence's start point. - - `min_radius`: A float representing the minimum radius, used as the geometric sequence's end point. -- **Returns**: A NumPy array `freq_list` containing the list of frequencies initialized as per the method specified by `freq_init`. -- **Description**: Calculates a list of frequencies based on the initialization method specified. If `freq_init` is 'random', it generates `frequency_num` random frequencies, each multiplied by `max_radius`. If `freq_init` is 'geometric', it generates a list of frequencies based on a geometric progression from `min_radius` to `max_radius` with `frequency_num` elements. -- **Exceptions**: None explicitly raised, but if `frequency_num` is less than 1, it may cause an error in the geometric initialization logic. - - -#### `cal_freq_mat()` -Generates a matrix of frequencies for encoding. -- **Returns**: A frequency matrix (`np.array`) for use in positional encoding. - -#### `cal_input_dim()` -Computes the dimension of the encoded spatial relation embedding based on the frequency and coordinate dimensions. -- **Returns**: The input dimension (int) of the encoder. - -#### `cal_elementwise_angle(coord, cur_freq)` -Calculates the angle for each coordinate and frequency, to be used in the sinusoidal functions. -- **Parameters**: - - `coord`: The coordinate value (`deltaX` or `deltaY`). - - `cur_freq`: The current frequency being processed. -- **Returns**: The calculated angle (float). - -#### `cal_coord_embed(coords_tuple)` -Encodes a tuple of coordinates into a sinusoidal embedding. 
-- **Parameters**: - - `coords_tuple`: A tuple of coordinate values. -- **Returns**: A list of sinusoidal embeddings (`list`). - -#### `forward(coords)` -Abstract method for transforming spatial coordinates into embeddings. Must be implemented by subclasses. -- **Parameters**: - - `coords`: Spatial coordinates to encode. -- **Raises**: - - `NotImplementedError`: If the method is not overridden by a subclass. - -#### `visualize_embed_cosine` -Visualizes the cosine similarity of embeddings on a 2D plot. -- **Parameters**: - - `embed`: Embedding vector with shape `(spa_embed_dim, 1)`. - - `module`: The model module containing the embedding layers. - - `layername`: Specifies the layer name for which the embeddings are visualized (`"input_emb"` or `"output_emb"`). - - `coords`: Coordinates for the embeddings. - - `extent`: Extent of the plot area. - - `centerpt`: (Optional) The center point to highlight. - - `xy_list`: (Optional) List of points to plot. - - `pt_size`: (Optional) Size of the points. - - `polygon`: (Optional) Polygon to outline on the plot. - - `img_path`: (Optional) Path to save the plot image. - -#### `get_coords` -Generates a grid of coordinates within a specified extent. -- **Parameters**: - - `extent`: The bounding box for the coordinate grid. - - `interval`: The spacing between points in the grid. - -#### `map_id2geo` -Plots geographical locations based on their IDs. -- **Parameters**: - - `place2geo`: A mapping from place IDs to geographical coordinates. - -#### `visualize_encoder` -Visualizes the output of an encoder layer for a given set of coordinates. -- **Parameters**: - - `module`: The model module containing the encoder. - - `layername`: Specifies the encoder layer (`"input_emb"` or `"output_emb"`). - - `coords`: Coordinates for visualization. - - `extent`: Extent of the plot area. - - `num_ch`: Number of channels to visualize. - - `img_path`: (Optional) Path to save the visualization. - -#### `spa_enc_embed_clustering` -Performs spatial encoding embedding clustering and visualization. -- **Parameters**: - - `module`: The model module to use for forward pass. - - `num_cluster`: Number of clusters for the agglomerative clustering. - - `extent`: Extent of the plot area. - - `interval`: Interval between points in the grid. - - `coords`: Coordinates for clustering. - - `tsne_comp`: Number of components for t-SNE reduction. - -#### `make_enc_map` -Creates a map visualization based on encoder cluster labels. -- **Parameters**: - - `cluster_labels`: Cluster labels for each point in the grid. - - `num_cluster`: Number of clusters. - - `extent`: Extent of the plot area. - - `margin`: Margin around the plot area. - - `xy_list`: (Optional) List of points to plot. - - `polygon`: (Optional) Polygon to outline on the plot. - - `usa_gdf`: (Optional) GeoDataFrame for the USA map. - - `coords_color`: (Optional) Color for the coordinates. - - `colorbar`: (Optional) Flag to display a color bar. - - `img_path`: (Optional) Path to save the map image. - - `xlabel`, `ylabel`: (Optional) Labels for the x and y axes. - -#### `explode` -Converts a GeoDataFrame with MultiPolygons into a GeoDataFrame with Polygons. -- **Parameters**: - - `indata`: Input GeoDataFrame or file path. - -#### `get_pts_in_box` -Filters points within a specified bounding box. -- **Parameters**: - - `place2geo`: A mapping from place IDs to geographical coordinates. - - `extent`: The bounding box for filtering. - -#### `load_USA_geojson` -Loads and projects the USA mainland GeoJSON to the EPSG:2163 projection system. 
-- **Parameters**: - - `us_geojson_file`: Path to the USA GeoJSON file. - -#### `get_projected_mainland_USA_states` -Loads and projects mainland USA states from a GeoJSON file to the EPSG:2163 projection system. -- **Parameters**: - - `us_states_geojson_file`: Path to the USA states GeoJSON file. - -#### `read2idIndexFile` -Reads an entity or relation to ID mapping file. -- **Parameters**: - - `Index2idFilePath`: Path to the file containing the mappings. - -#### `reverse_dict` -Reverses a dictionary mapping. -- **Parameters**: - - `iri2id`: The dictionary to reverse. - -#### `get_node_mode` -Determines the mode (type) of a node based on the provided mappings. -- **Parameters**: - - `node_maps`: A mapping of node types to their IDs. - - `node_id`: The ID of the node to determine the mode for. - -#### `path_embedding_compute` -Computes the embedding for a path between nodes. -- **Parameters**: - - `path_dec`: The path decoder. - - ` - - - -# 2. Aggregation location encoder -

- *(figure: Aggregation Location Encoder Structure)*
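This section contains only the structure figure. As a rough, assumed illustration of the general idea (not the repository's actual implementation), an aggregation location encoder applies a single-point location encoder to every context point and pools the per-point embeddings into a single vector:

```python
import torch
import torch.nn as nn

class MeanAggregationLocationEncoderSketch(nn.Module):
    """Assumed sketch: encode every context point, then mean-pool into one embedding."""

    def __init__(self, point_encoder, spa_embed_dim):
        super().__init__()
        self.point_encoder = point_encoder   # any single-point location encoder
        self.out = nn.Linear(spa_embed_dim, spa_embed_dim)

    def forward(self, coords):
        # coords: (batch_size, num_context_pt, coord_dim)
        pt_embeds = self.point_encoder(coords)   # (batch_size, num_context_pt, spa_embed_dim)
        agg = pt_embeds.mean(dim=1)              # (batch_size, spa_embed_dim)
        return self.out(agg)

# Stand-in point encoder (a plain linear map) just to make the sketch runnable.
enc = MeanAggregationLocationEncoderSketch(nn.Linear(2, 64), spa_embed_dim=64)
print(enc(torch.randn(8, 5, 2)).shape)  # torch.Size([8, 64])
```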

diff --git a/documentation/figs/NeRF.png b/documentation/figs/NeRF.png deleted file mode 100644 index fc32c8f9..00000000 Binary files a/documentation/figs/NeRF.png and /dev/null differ diff --git a/documentation/figs/Sphere2Vec-sphereC.png b/documentation/figs/Sphere2Vec-sphereC.png deleted file mode 100644 index 8c192351..00000000 Binary files a/documentation/figs/Sphere2Vec-sphereC.png and /dev/null differ diff --git a/documentation/figs/aggregation_location_encoder_structure.png b/documentation/figs/aggregation_location_encoder_structure.png deleted file mode 100644 index 09157daa..00000000 Binary files a/documentation/figs/aggregation_location_encoder_structure.png and /dev/null differ diff --git a/documentation/figs/dfs.png b/documentation/figs/dfs.png deleted file mode 100644 index e0c4f0d2..00000000 Binary files a/documentation/figs/dfs.png and /dev/null differ diff --git a/documentation/figs/readme.md b/documentation/figs/readme.md deleted file mode 100644 index 8b137891..00000000 --- a/documentation/figs/readme.md +++ /dev/null @@ -1 +0,0 @@ - diff --git a/documentation/figs/single_location_encoder_structure.png b/documentation/figs/single_location_encoder_structure.png deleted file mode 100644 index 06895d21..00000000 Binary files a/documentation/figs/single_location_encoder_structure.png and /dev/null differ diff --git a/documentation/figs/sphereC+.png b/documentation/figs/sphereC+.png deleted file mode 100644 index 710ee4e4..00000000 Binary files a/documentation/figs/sphereC+.png and /dev/null differ diff --git a/documentation/figs/sphereM+.png b/documentation/figs/sphereM+.png deleted file mode 100644 index e429c10f..00000000 Binary files a/documentation/figs/sphereM+.png and /dev/null differ diff --git a/documentation/figs/sphereM.png b/documentation/figs/sphereM.png deleted file mode 100644 index 1d00e187..00000000 Binary files a/documentation/figs/sphereM.png and /dev/null differ diff --git a/documentation/figs/xyz.png b/documentation/figs/xyz.png deleted file mode 100644 index 8269f3a1..00000000 Binary files a/documentation/figs/xyz.png and /dev/null differ diff --git a/documentation/readme.md b/documentation/readme.md deleted file mode 100644 index 8b137891..00000000 --- a/documentation/readme.md +++ /dev/null @@ -1 +0,0 @@ - diff --git a/documentation/Encoders/readme.md b/eval_results/classification/eval_fmow__val_spherical_harmonics.csv similarity index 100% rename from documentation/Encoders/readme.md rename to eval_results/classification/eval_fmow__val_spherical_harmonics.csv diff --git a/figs/.DS_Store b/figs/.DS_Store new file mode 100644 index 00000000..5008ddfc Binary files /dev/null and b/figs/.DS_Store differ diff --git a/figs/TorchSpatial4_10regTasks.jpg b/figs/TorchSpatial4_10regTasks.jpg new file mode 100644 index 00000000..c6abdbd0 Binary files /dev/null and b/figs/TorchSpatial4_10regTasks.jpg differ diff --git a/main/.DS_Store b/main/.DS_Store index af48b195..74ca10e7 100644 Binary files a/main/.DS_Store and b/main/.DS_Store differ diff --git a/main/SpatialRelationEncoder.py b/main/SpatialRelationEncoder.py new file mode 100644 index 00000000..55bc0906 --- /dev/null +++ b/main/SpatialRelationEncoder.py @@ -0,0 +1,3112 @@ +import torch +import torch.nn as nn +from torch.nn import init +import torch.nn.functional as F + +import numpy as np +import math + +from module import * +from data_utils import * + +from spherical_harmonics_ylm_numpy import get_positional_encoding + +""" +A Set of position encoder +""" + + +def _cal_freq_list(freq_init, frequency_num, 
max_radius, min_radius): + if freq_init == "random": + # the frequence we use for each block, alpha in ICLR paper + # freq_list shape: (frequency_num) + freq_list = np.random.random(size=[frequency_num]) * max_radius + elif freq_init == "geometric": + # freq_list = [] + # for cur_freq in range(frequency_num): + # base = 1.0/(np.power(max_radius, cur_freq*1.0/(frequency_num-1))) + # freq_list.append(base) + + # freq_list = np.asarray(freq_list) + + log_timescale_increment = math.log(float(max_radius) / float(min_radius)) / ( + frequency_num * 1.0 - 1 + ) + + timescales = min_radius * np.exp( + np.arange(frequency_num).astype(float) * log_timescale_increment + ) + + freq_list = 1.0 / timescales + elif freq_init == "nerf": + """ + compute according to NeRF position encoding, + Equation 4 in https://arxiv.org/pdf/2003.08934.pdf + 2^{0}*pi, ..., 2^{L-1}*pi + """ + # + + freq_list = np.pi * np.exp2(np.arange(frequency_num).astype(float)) + + return freq_list + + +class GridCellSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of(deltaX, deltaY), encode them using the position encoding function + """ + + def __init__( + self, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ): + """ + Args: + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal with different frequencies / wavelengths + max_radius: the largest context radius this model can handle + """ + super().__init__(coord_dim=coord_dim, device=device) + self.frequency_num = frequency_num + self.max_radius = max_radius + self.min_radius = min_radius + self.freq_init = freq_init + # the frequence we use for each block, alpha in ICLR paper + self.cal_freq_list() + self.cal_freq_mat() + + self.pos_enc_output_dim = self.cal_pos_enc_output_dim() + + def cal_elementwise_angle(self, coord, cur_freq): + """ + Args: + coord: the deltaX or deltaY + cur_freq: the frequency + """ + return coord / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + + def cal_coord_embed(self, coords_tuple): + embed = [] + for coord in coords_tuple: + for cur_freq in range(self.frequency_num): + embed.append(math.sin(self.cal_elementwise_angle(coord, cur_freq))) + embed.append(math.cos(self.cal_elementwise_angle(coord, cur_freq))) + # embed: shape (pos_enc_output_dim) + return embed + + def cal_pos_enc_output_dim(self): + # compute the dimention of the encoded spatial relation embedding + return int(self.coord_dim * self.frequency_num * 2) + + def cal_freq_list(self): + self.freq_list = _cal_freq_list( + self.freq_init, self.frequency_num, self.max_radius, self.min_radius + ) + + def cal_freq_mat(self): + # freq_mat shape: (frequency_num, 1) + freq_mat = np.expand_dims(self.freq_list, axis=1) + # self.freq_mat shape: (frequency_num, 2) + self.freq_mat = np.repeat(freq_mat, 2, axis=1) + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: # noqa: E721 + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + # coords_mat: shape (batch_size, num_context_pt, 2, 1) + coords_mat = np.expand_dims(coords_mat, axis=3) + # coords_mat: shape 
(batch_size, num_context_pt, 2, 1, 1) + coords_mat = np.expand_dims(coords_mat, axis=4) + # coords_mat: shape (batch_size, num_context_pt, 2, frequency_num, 1) + coords_mat = np.repeat(coords_mat, self.frequency_num, axis=3) + # coords_mat: shape (batch_size, num_context_pt, 2, frequency_num, 2) + coords_mat = np.repeat(coords_mat, 2, axis=4) + # spr_embeds: shape (batch_size, num_context_pt, 2, frequency_num, 2) + spr_embeds = coords_mat * self.freq_mat + + # make sinuniod function + # sin for 2i, cos for 2i+1 + # spr_embeds: (batch_size, num_context_pt, 2*frequency_num*2=pos_enc_output_dim) + spr_embeds[:, :, :, :, 0::2] = np.sin(spr_embeds[:, :, :, :, 0::2]) # dim 2i + spr_embeds[:, :, :, :, 1::2] = np.cos(spr_embeds[:, :, :, :, 1::2]) # dim 2i+1 + + # (batch_size, num_context_pt, 2*frequency_num*2) + spr_embeds = np.reshape(spr_embeds, (batch_size, num_context_pt, -1)) + + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords(deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape(batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape(batch_size, num_context_pt, position_embed_dim) + """ + + spr_embeds = self.make_output_embeds(coords) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + + return spr_embeds + + +class GridCellSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim=64, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="GridCellSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.frequency_num = frequency_num + self.max_radius = max_radius + self.min_radius = min_radius + self.freq_init = freq_init + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = GridCellSpatialRelationPositionEncoder( + coord_dim=coord_dim, + frequency_num=frequency_num, + max_radius=max_radius, + min_radius=min_radius, + freq_init=freq_init, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + # input_dim=int(4 * frequency_num), + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class GridCellNormSpatialRelationEncoder(nn.Module): + """ + Given a list of (deltaX,deltaY), encode them using the position encoding function + """ + + def __init__( + self, + spa_embed_dim, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + ffn=None, + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal 
with different frequencies/wavelengths + max_radius: the largest context radius this model can handle + """ + super(GridCellNormSpatialRelationEncoder, self).__init__() + self.spa_embed_dim = spa_embed_dim + self.coord_dim = coord_dim + self.frequency_num = frequency_num + self.freq_init = freq_init + self.max_radius = max_radius + self.min_radius = min_radius + # the frequence we use for each block, alpha in ICLR paper + self.cal_freq_list() + self.cal_freq_mat() + + self.pos_enc_output_dim = self.cal_input_dim() + + self.ffn = ffn + self.device = device + + def cal_elementwise_angle(self, coord, cur_freq): + """ + Args: + coord: the deltaX or deltaY + cur_freq: the frequency + """ + return coord / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + + def cal_coord_embed(self, coords_tuple): + embed = [] + for coord in coords_tuple: + for cur_freq in range(self.frequency_num): + embed.append(math.sin(self.cal_elementwise_angle(coord, cur_freq))) + embed.append(math.cos(self.cal_elementwise_angle(coord, cur_freq))) + # embed: shape (pos_enc_output_dim) + return embed + + def cal_input_dim(self): + # compute the dimention of the encoded spatial relation embedding + return int(self.coord_dim * self.frequency_num * 2) + + def cal_freq_list(self): + # if self.freq_init == "random": + # # the frequence we use for each block, alpha in ICLR paper + # # self.freq_list shape: (frequency_num) + # self.freq_list = np.random.random(size=[self.frequency_num]) * self.max_radius + # elif self.freq_init == "geometric": + # self.freq_list = [] + # for cur_freq in range(self.frequency_num): + # base = 1.0/(np.power(self.max_radius, cur_freq*1.0/(self.frequency_num-1))) + # self.freq_list.append(base) + + # self.freq_list = np.asarray(self.freq_list) + self.freq_list = _cal_freq_list( + self.freq_init, self.frequency_num, self.max_radius, self.min_radius + ) + + def cal_freq_mat(self): + # freq_mat shape: (frequency_num, 1) + freq_mat = np.expand_dims(self.freq_list, axis=1) + # self.freq_mat shape: (frequency_num, 2) + self.freq_mat = np.repeat(freq_mat, 2, axis=1) + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + # coords_mat: shape (batch_size, num_context_pt, 2, 1) + coords_mat = np.expand_dims(coords_mat, axis=3) + # coords_mat: shape (batch_size, num_context_pt, 2, 1, 1) + coords_mat = np.expand_dims(coords_mat, axis=4) + # coords_mat: shape (batch_size, num_context_pt, 2, frequency_num, 1) + coords_mat = np.repeat(coords_mat, self.frequency_num, axis=3) + # coords_mat: shape (batch_size, num_context_pt, 2, frequency_num, 2) + coords_mat = np.repeat(coords_mat, 2, axis=4) + # spr_embeds: shape (batch_size, num_context_pt, 2, frequency_num, 2) + spr_embeds = coords_mat * self.freq_mat + + # convert to radius + coords_mat = coords_mat * math.pi / 180 + + # make sinuniod function + # sin for 2i, cos for 2i+1 + # spr_embeds: (batch_size, num_context_pt, 2*frequency_num*2=pos_enc_output_dim) + spr_embeds[:, :, :, :, 0::2] = np.sin(spr_embeds[:, :, :, :, 0::2]) # dim 2i + spr_embeds[:, :, :, :, 1::2] = np.cos(spr_embeds[:, :, :, :, 1::2]) # dim 
2i+1 + + # (batch_size, num_context_pt, 2*frequency_num*2) + spr_embeds = np.reshape(spr_embeds, (batch_size, num_context_pt, -1)) + + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + + spr_embeds = self.make_output_embeds(coords) + + # # loop over all batches + # spr_embeds = [] + # for cur_batch in coords: + # # loop over N context points + # cur_embeds = [] + # for coords_tuple in cur_batch: + # cur_embeds.append(self.cal_coord_embed(coords_tuple)) + # spr_embeds.append(cur_embeds) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + # sprenc = torch.einsum("bnd,dk->bnk", (spr_embeds, self.post_mat)) + # sprenc = self.f_act(self.dropout(self.post_linear(spr_embeds))) + + # return sprenc + if self.ffn is not None: + return self.ffn(spr_embeds) + else: + return spr_embeds + + +class HexagonGridCellSpatialRelationEncoder(nn.Module): + """ + Given a list of (deltaX,deltaY), encode them using the position encoding function + + """ + + def __init__( + self, + spa_embed_dim, + coord_dim=2, + frequency_num=16, + max_radius=10000, + dropout=0.5, + f_act="sigmoid", + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal with different frequencies/wavelengths + max_radius: the largest context radius this model can handle + """ + super(HexagonGridCellSpatialRelationEncoder, self).__init__() + self.frequency_num = frequency_num + self.coord_dim = coord_dim + self.max_radius = max_radius + self.spa_embed_dim = spa_embed_dim + + self.pos_enc_output_dim = self.cal_input_dim() + + self.post_linear = nn.Linear(self.pos_enc_output_dim, self.spa_embed_dim) + nn.init.xavier_uniform(self.post_linear.weight) + self.dropout = nn.Dropout(p=dropout) + + # self.dropout_ = nn.Dropout(p=dropout) + + # self.post_mat = nn.Parameter(torch.FloatTensor(self.pos_enc_output_dim, self.spa_embed_dim)) + # init.xavier_uniform_(self.post_mat) + # self.register_parameter("spa_postmat", self.post_mat) + + self.f_act = get_activation_function( + f_act, "HexagonGridCellSpatialRelationEncoder" + ) + + self.device = device + + def cal_elementwise_angle(self, coord, cur_freq): + """ + Args: + coord: the deltaX or deltaY + cur_freq: the frequency + """ + return coord / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + + def cal_coord_embed(self, coords_tuple): + embed = [] + for coord in coords_tuple: + for cur_freq in range(self.frequency_num): + embed.append(math.sin(self.cal_elementwise_angle(coord, cur_freq))) + embed.append( + math.sin( + self.cal_elementwise_angle(coord, cur_freq) + math.pi * 2.0 / 3 + ) + ) + embed.append( + math.sin( + self.cal_elementwise_angle(coord, cur_freq) + math.pi * 4.0 / 3 + ) + ) + # embed: shape (pos_enc_output_dim) + return embed + + def cal_input_dim(self): + # compute the dimention of the encoded spatial relation embedding + return int(self.coord_dim * self.frequency_num * 3) + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list 
with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # loop over all batches + spr_embeds = [] + for cur_batch in coords: + # loop over N context points + cur_embeds = [] + for coords_tuple in cur_batch: + cur_embeds.append(self.cal_coord_embed(coords_tuple)) + spr_embeds.append(cur_embeds) + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + # sprenc = torch.einsum("bnd,dk->bnk", (spr_embeds, self.post_mat)) + sprenc = self.f_act(self.dropout(self.post_linear(spr_embeds))) + + return sprenc + + +""" +The theory based Grid cell spatial relation encoder, +See https://openreview.net/forum?id=Syx0Mh05YQ +Learning Grid Cells as Vector Representation of Self-Position Coupled with Matrix Representation of Self-Motion +""" + + +class TheoryGridCellSpatialRelationPositionEncoder( + GridCellSpatialRelationPositionEncoder +): + """ + Given a list of (deltaX,deltaY), encode them using the position encoding function + """ + + def __init__( + self, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=1000, + freq_init="geometric", + device="cuda", + ): + """ + Args: + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal with different frequencies/wavelengths + max_radius: the largest context radius this model can handle + """ + super().__init__( + coord_dim=coord_dim, + frequency_num=frequency_num, + max_radius=max_radius, + min_radius=min_radius, + freq_init=freq_init, + device=device, + ) + # the frequence we use for each block, alpha in ICLR paper + self.cal_freq_list() + self.cal_freq_mat() + + # there unit vectors which is 120 degree apart from each other + self.unit_vec1 = np.asarray([1.0, 0.0]) # 0 + self.unit_vec2 = np.asarray([-1.0 / 2.0, math.sqrt(3) / 2.0]) # 120 degree + self.unit_vec3 = np.asarray([-1.0 / 2.0, -math.sqrt(3) / 2.0]) # 240 degree + + self.pos_enc_output_dim = self.cal_pos_enc_output_dim() + + def cal_freq_mat(self): + # freq_mat shape: (frequency_num, 1) + freq_mat = np.expand_dims(self.freq_list, axis=1) + # self.freq_mat shape: (frequency_num, 6) + self.freq_mat = np.repeat(freq_mat, 6, axis=1) + + def cal_pos_enc_output_dim(self): + # compute the dimention of the encoded spatial relation embedding + return int(6 * self.frequency_num) + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # (batch_size, num_context_pt, coord_dim) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # compute the dot product between [deltaX, deltaY] and each unit_vec + # (batch_size, num_context_pt, 1) + angle_mat1 = np.expand_dims(np.matmul(coords_mat, self.unit_vec1), axis=-1) + # (batch_size, num_context_pt, 1) + angle_mat2 = np.expand_dims(np.matmul(coords_mat, self.unit_vec2), axis=-1) + # 
(batch_size, num_context_pt, 1) + angle_mat3 = np.expand_dims(np.matmul(coords_mat, self.unit_vec3), axis=-1) + + # (batch_size, num_context_pt, 6) + angle_mat = np.concatenate( + [angle_mat1, angle_mat1, angle_mat2, angle_mat2, angle_mat3, angle_mat3], + axis=-1, + ) + # (batch_size, num_context_pt, 1, 6) + angle_mat = np.expand_dims(angle_mat, axis=-2) + # (batch_size, num_context_pt, frequency_num, 6) + angle_mat = np.repeat(angle_mat, self.frequency_num, axis=-2) + # (batch_size, num_context_pt, frequency_num, 6) + angle_mat = angle_mat * self.freq_mat + # (batch_size, num_context_pt, frequency_num*6) + spr_embeds = np.reshape(angle_mat, (batch_size, num_context_pt, -1)) + + # make sinuniod function + # sin for 2i, cos for 2i+1 + # spr_embeds: (batch_size, num_context_pt, frequency_num*6=pos_enc_output_dim) + spr_embeds[:, :, 0::2] = np.sin(spr_embeds[:, :, 0::2]) # dim 2i + spr_embeds[:, :, 1::2] = np.cos(spr_embeds[:, :, 1::2]) # dim 2i+1 + + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + spr_embeds = self.make_output_embeds(coords) + + # spr_embeds: (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + return spr_embeds + + +class TheoryGridCellSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="TheoryGridCellSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.frequency_num = frequency_num + self.max_radius = max_radius + self.min_radius = min_radius + self.freq_init = freq_init + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = TheoryGridCellSpatialRelationPositionEncoder( + coord_dim=coord_dim, + frequency_num=frequency_num, + max_radius=max_radius, + min_radius=min_radius, + freq_init=freq_init, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +""" +The theory based Grid cell spatial relation encoder, +See https://openreview.net/forum?id=Syx0Mh05YQ +Learning Grid Cells as Vector Representation of Self-Position Coupled with Matrix Representation of Self-Motion +We retrict the linear layer is block diagonal +""" + + +class TheoryDiagGridCellSpatialRelationEncoder(nn.Module): + """ + Given a list of (deltaX,deltaY), encode them using the position encoding function + + """ + + def __init__( + self, + 
spa_embed_dim, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + dropout=0.5, + f_act="sigmoid", + freq_init="geometric", + use_layn=False, + use_post_mat=False, + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal with different frequencies/wavelengths + max_radius: the largest context radius this model can handle + """ + super(TheoryDiagGridCellSpatialRelationEncoder, self).__init__() + self.frequency_num = frequency_num + self.coord_dim = coord_dim + self.max_radius = max_radius + self.min_radius = min_radius + self.spa_embed_dim = spa_embed_dim + self.freq_init = freq_init + + self.device = device + + # the frequence we use for each block, alpha in ICLR paper + self.cal_freq_list() + self.cal_freq_mat() + + # there unit vectors which is 120 degree apart from each other + self.unit_vec1 = np.asarray([1.0, 0.0]) # 0 + self.unit_vec2 = np.asarray([-1.0 / 2.0, math.sqrt(3) / 2.0]) # 120 degree + self.unit_vec3 = np.asarray([-1.0 / 2.0, -math.sqrt(3) / 2.0]) # 240 degree + + self.pos_enc_output_dim = self.cal_input_dim() + + assert self.spa_embed_dim % self.frequency_num == 0 + + # self.post_linear = nn.Linear(self.frequency_num, 6, self.spa_embed_dim//self.frequency_num) + + # a block diagnal matrix + self.post_mat = nn.Parameter( + torch.FloatTensor( + self.frequency_num, 6, self.spa_embed_dim // self.frequency_num + ).to(device) + ) + init.xavier_uniform_(self.post_mat) + self.register_parameter("spa_postmat", self.post_mat) + self.dropout = nn.Dropout(p=dropout) + + self.use_post_mat = use_post_mat + if self.use_post_mat: + self.post_linear = nn.Linear(self.spa_embed_dim, self.spa_embed_dim) + self.dropout_ = nn.Dropout(p=dropout) + + self.f_act = get_activation_function( + f_act, "TheoryDiagGridCellSpatialRelationEncoder" + ) + + def cal_freq_list(self): + # if self.freq_init == "random": + # # the frequence we use for each block, alpha in ICLR paper + # # self.freq_list shape: (frequency_num) + # self.freq_list = np.random.random(size=[self.frequency_num]) * self.max_radius + # elif self.freq_init == "geometric": + # self.freq_list = [] + # for cur_freq in range(self.frequency_num): + # base = 1.0/(np.power(self.max_radius, cur_freq*1.0/(self.frequency_num-1))) + # self.freq_list.append(base) + + # self.freq_list = np.asarray(self.freq_list) + self.freq_list = _cal_freq_list( + self.freq_init, self.frequency_num, self.max_radius, self.min_radius + ) + + def cal_freq_mat(self): + # freq_mat shape: (frequency_num, 1) + freq_mat = np.expand_dims(self.freq_list, axis=1) + # self.freq_mat shape: (frequency_num, 6) + self.freq_mat = np.repeat(freq_mat, 6, axis=1) + + def cal_input_dim(self): + # compute the dimention of the encoded spatial relation embedding + return int(6 * self.frequency_num) + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # (batch_size, num_context_pt, coord_dim) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # compute the dot product between [deltaX, deltaY] and each unit_vec + # (batch_size, num_context_pt, 1) + angle_mat1 = 
np.expand_dims(np.matmul(coords_mat, self.unit_vec1), axis=-1) + # (batch_size, num_context_pt, 1) + angle_mat2 = np.expand_dims(np.matmul(coords_mat, self.unit_vec2), axis=-1) + # (batch_size, num_context_pt, 1) + angle_mat3 = np.expand_dims(np.matmul(coords_mat, self.unit_vec3), axis=-1) + + # (batch_size, num_context_pt, 6) + angle_mat = np.concatenate( + [angle_mat1, angle_mat1, angle_mat2, angle_mat2, angle_mat3, angle_mat3], + axis=-1, + ) + # (batch_size, num_context_pt, 1, 6) + angle_mat = np.expand_dims(angle_mat, axis=-2) + # (batch_size, num_context_pt, frequency_num, 6) + angle_mat = np.repeat(angle_mat, self.frequency_num, axis=-2) + # (batch_size, num_context_pt, frequency_num, 6) + spr_embeds = angle_mat * self.freq_mat + + # make sinuniod function + # sin for 2i, cos for 2i+1 + # spr_embeds: (batch_size, num_context_pt, frequency_num, 6) + spr_embeds[:, :, :, 0::2] = np.sin(spr_embeds[:, :, :, 0::2]) # dim 2i + spr_embeds[:, :, :, 1::2] = np.cos(spr_embeds[:, :, :, 1::2]) # dim 2i+1 + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + # (batch_size, num_context_pt, coord_dim) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + spr_embeds = self.make_output_embeds(coords) + + # spr_embeds: (batch_size, num_context_pt, frequency_num, 6) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, frequency_num, spa_embed_dim//frequency_num) + sprenc = torch.einsum("bnfs,fsd->bnfd", (spr_embeds, self.post_mat)) + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + sprenc = sprenc.contiguous().view( + batch_size, num_context_pt, self.spa_embed_dim + ) + if self.use_post_mat: + sprenc = self.dropout(sprenc) + sprenc = self.f_act(self.dropout_(self.post_linear(sprenc))) + else: + # print(sprenc.size()) + sprenc = self.f_act(self.dropout(sprenc)) + + return sprenc + + +class NaiveSpatialRelationEncoder(nn.Module): + """ + Given a list of (deltaX,deltaY), encode them using the position encoding function + + """ + + def __init__(self, spa_embed_dim, extent, coord_dim=2, ffn=None, device="cuda"): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + extent: (x_min, x_max, y_min, y_max) + """ + super(NaiveSpatialRelationEncoder, self).__init__() + self.spa_embed_dim = spa_embed_dim + self.coord_dim = coord_dim + self.extent = extent + + # self.post_linear = nn.Linear(self.coord_dim, self.spa_embed_dim) + # nn.init.xavier_uniform(self.post_linear.weight) + # self.dropout = nn.Dropout(p=dropout) + + # self.f_act = get_activation_function(f_act, "NaiveSpatialRelationEncoder") + self.ffn = ffn + + self.device = device + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for 
GridCellSpatialRelationEncoder" + ) + + coords_mat = coord_normalize(coords, self.extent) + + # spr_embeds: shape (batch_size, num_context_pt, coord_dim) + spr_embeds = torch.FloatTensor(coords_mat).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + # sprenc = torch.einsum("bnd,dk->bnk", (spr_embeds, self.post_mat)) + + # sprenc = self.f_act(self.dropout(self.post_linear(spr_embeds))) + if self.ffn is not None: + return self.ffn(spr_embeds) + else: + return spr_embeds + + # return sprenc + + +class XYZSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (lon,lat), convert them to (x,y,z), and then encode them using the MLP + + """ + + def __init__(self, coord_dim=2, device="cuda"): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + extent: (x_min, x_max, y_min, y_max) + """ + super().__init__(coord_dim=coord_dim, device=device) + + self.pos_enc_output_dim = 3 # self.cal_pos_enc_output_dim() + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # (batch_size, num_context_pt, coord_dim) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # lon: (batch_size, num_context_pt, 1), convert from degree to radius + lon = np.deg2rad(coords_mat[:, :, :1]) + # lat: (batch_size, num_context_pt, 1), convert from degree to radius + lat = np.deg2rad(coords_mat[:, :, 1:]) + + # convert (lon, lat) to (x,y,z), assume in unit sphere with r = 1 + x = np.cos(lat) * np.cos(lon) + y = np.cos(lat) * np.sin(lon) + z = np.sin(lat) + + # spr_embeds: (batch_size, num_context_pt, 3) + spr_embeds = np.concatenate((x, y, z), axis=-1) + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + # (batch_size, num_context_pt, coord_dim) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # spr_embeds: (batch_size, num_context_pt, 3) + spr_embeds = self.make_output_embeds(coords) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim = 3) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + return spr_embeds + + +class XYZSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="XYZSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + self.ffn_context_str = ffn_context_str + + self.position_encoder = XYZSpatialRelationPositionEncoder( + coord_dim=coord_dim, device=device + ) + self.ffn = 
MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class NERFSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (lon,lat), convert them to (x,y,z), and then encode them using the MLP + + """ + + def __init__(self, coord_dim=2, frequency_num=16, freq_init="nerf", device="cuda"): + """ + Args: + coord_dim: the dimention of space, 2D, 3D, or other + """ + super().__init__(coord_dim=coord_dim, device=device) + self.coord_dim = coord_dim + self.frequency_num = frequency_num + self.freq_init = freq_init + + self.cal_freq_list() + self.cal_freq_mat() + + self.pos_enc_output_dim = 6 * frequency_num + + def cal_freq_list(self): + """ + compute according to NeRF position encoding, + Equation 4 in https://arxiv.org/pdf/2003.08934.pdf + 2^{0}*pi, ..., 2^{L-1}*pi + """ + # freq_list: shape (frequency_num) + self.freq_list = _cal_freq_list(self.freq_init, self.frequency_num, None, None) + + def cal_freq_mat(self): + # freq_mat shape: (1, frequency_num) + freq_mat = np.expand_dims(self.freq_list, axis=0) + # self.freq_mat shape: (3, frequency_num) + self.freq_mat = np.repeat(freq_mat, 3, axis=0) + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # (batch_size, num_context_pt, coord_dim) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # lon: (batch_size, num_context_pt, 1), convert from degree to radius + lon = np.deg2rad(coords_mat[:, :, :1]) + # lat: (batch_size, num_context_pt, 1), convert from degree to radius + lat = np.deg2rad(coords_mat[:, :, 1:]) + + # convert (lon, lat) to (x,y,z), assume in unit sphere with r = 1 + x = np.cos(lat) * np.cos(lon) + y = np.cos(lat) * np.sin(lon) + z = np.sin(lat) + + # coords_mat: (batch_size, num_context_pt, 3) + coords_mat = np.concatenate((x, y, z), axis=-1) + # coords_mat: (batch_size, num_context_pt, 3, 1) + coords_mat = np.expand_dims(coords_mat, axis=-1) + # coords_mat: (batch_size, num_context_pt, 3, frequency_num) + coords_mat = np.repeat(coords_mat, self.frequency_num, axis=-1) + # coords_mat: (batch_size, num_context_pt, 3, frequency_num) + coords_mat = coords_mat * self.freq_mat + + # coords_mat: (batch_size, num_context_pt, 6, frequency_num) + spr_embeds = np.concatenate([np.sin(coords_mat), np.cos(coords_mat)], axis=2) + + # spr_embeds: (batch_size, num_context_pt, 6*frequency_num) + spr_embeds = np.reshape(spr_embeds, (batch_size, num_context_pt, -1)) + + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + # (batch_size, num_context_pt, coord_dim) + coords_mat = 
np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # spr_embeds: (batch_size, num_context_pt, 3) + spr_embeds = self.make_output_embeds(coords) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim = 6*frequency_num) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + return spr_embeds + + +class NERFSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + device="cuda", + frequency_num=16, + freq_init="nerf", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="NERFSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.frequency_num = frequency_num + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = NERFSpatialRelationPositionEncoder( + coord_dim=coord_dim, frequency_num=frequency_num, device=device + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class RBFSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (X,Y), compute the distance from each pt to each RBF anchor points + Feed into a MLP + + This is for global position encoding or relative/spatial context position encoding + """ + + def __init__( + self, + model_type, + train_locs, + coord_dim=2, + num_rbf_anchor_pts=100, + rbf_kernel_size=10e2, + rbf_kernel_size_ratio=0.0, + max_radius=10000, + rbf_anchor_pt_ids=None, + device="cuda", + ): + """ + Args: + train_locs: np.arrary, [batch_size, 2], location data + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + num_rbf_anchor_pts: the number of RBF anchor points + rbf_kernel_size: the RBF kernel size + The sigma in https://en.wikipedia.org/wiki/Radial_basis_function_kernel + rbf_kernel_size_ratio: if not None, (only applied on relative model) + different anchor pts have different kernel size : + dist(anchot_pt, origin) * rbf_kernel_size_ratio + rbf_kernel_size + max_radius: the relative spatial context size in spatial context model + """ + super().__init__(coord_dim=coord_dim, device=device) + self.model_type = model_type + self.train_locs = train_locs + self.num_rbf_anchor_pts = num_rbf_anchor_pts + self.rbf_kernel_size = rbf_kernel_size + self.rbf_kernel_size_ratio = rbf_kernel_size_ratio + self.max_radius = max_radius + self.rbf_anchor_pt_ids = rbf_anchor_pt_ids + + # calculate the coordinate matrix for each RBF anchor points + self.cal_rbf_anchor_coord_mat() + + self.pos_enc_output_dim = self.num_rbf_anchor_pts + + def _random_sampling(self, item_tuple, num_sample): + """ + poi_type_tuple: (Type1, Type2,...TypeM) + """ + + type_list = list(item_tuple) + if len(type_list) > num_sample: + return 
list(np.random.choice(type_list, num_sample, replace=False)) + elif len(type_list) == num_sample: + return item_tuple + else: + return list(np.random.choice(type_list, num_sample, replace=True)) + + def cal_rbf_anchor_coord_mat(self): + if self.model_type == "global": + assert self.rbf_kernel_size_ratio == 0 + # If we do RBF on location/global model, + # we need to random sample M RBF anchor points from training point dataset + if self.rbf_anchor_pt_ids == None: + self.rbf_anchor_pt_ids = self._random_sampling( + np.arange(len(self.train_locs)), self.num_rbf_anchor_pts + ) + + self.rbf_coords_mat = self.train_locs[self.rbf_anchor_pt_ids] + + elif self.model_type == "relative": + # If we do RBF on spatial context/relative model, + # We just ra ndom sample M-1 RBF anchor point in the relative spatial context defined by max_radius + # The (0,0) is also an anchor point + x_list = np.random.uniform( + -self.max_radius, self.max_radius, self.num_rbf_anchor_pts + ) + x_list[0] = 0.0 + y_list = np.random.uniform( + -self.max_radius, self.max_radius, self.num_rbf_anchor_pts + ) + y_list[0] = 0.0 + # self.rbf_coords: (num_rbf_anchor_pts, 2) + self.rbf_coords_mat = np.transpose(np.stack([x_list, y_list], axis=0)) + + if self.rbf_kernel_size_ratio > 0: + dist_mat = np.sqrt(np.sum(np.power(self.rbf_coords_mat, 2), axis=-1)) + # rbf_kernel_size_mat: (num_rbf_anchor_pts) + self.rbf_kernel_size_mat = ( + dist_mat * self.rbf_kernel_size_ratio + self.rbf_kernel_size + ) + + def make_output_embeds(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt=1, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, pos_enc_output_dim) + """ + if type(coords) == np.ndarray: + # print("np.shape(coords)",np.shape(coords)) + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception("Unknown coords data type for RBFSpatialRelationEncoder") + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + # coords_mat: shape (batch_size, num_context_pt, 1, 2) + coords_mat = np.expand_dims(coords_mat, axis=2) + # coords_mat: shape (batch_size, num_context_pt, num_rbf_anchor_pts, 2) + coords_mat = np.repeat(coords_mat, self.num_rbf_anchor_pts, axis=2) + # compute (deltaX, deltaY) between each point and each RBF anchor points + # coords_mat: shape (batch_size, num_context_pt, num_rbf_anchor_pts, 2) + coords_mat = coords_mat - self.rbf_coords_mat + # coords_mat: shape (batch_size, num_context_pt, num_rbf_anchor_pts=pos_enc_output_dim) + coords_mat = np.sum(np.power(coords_mat, 2), axis=3) + if self.rbf_kernel_size_ratio > 0: + spr_embeds = np.exp( + (-1 * coords_mat) / (2.0 * np.power(self.rbf_kernel_size_mat, 2)) + ) + else: + # spr_embeds: shape (batch_size, num_context_pt, num_rbf_anchor_pts=pos_enc_output_dim) + spr_embeds = np.exp( + (-1 * coords_mat) / (2.0 * np.power(self.rbf_kernel_size, 2)) + ) + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt=1, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + spr_embeds = self.make_output_embeds(coords) + + # spr_embeds: 
shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + return spr_embeds + + +class RBFSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + train_locs, + model_type, + coord_dim=2, + device="cuda", + num_rbf_anchor_pts=100, + rbf_kernel_size=10e2, + rbf_kernel_size_ratio=0.0, + max_radius=10000, + rbf_anchor_pt_ids=None, + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="RBFSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.train_locs = train_locs + self.model_type = model_type + self.num_rbf_anchor_pts = num_rbf_anchor_pts + self.rbf_kernel_size = rbf_kernel_size + self.rbf_kernel_size_ratio = rbf_kernel_size_ratio + self.max_radius = max_radius + self.rbf_anchor_pt_ids = rbf_anchor_pt_ids + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + self.ffn_context_str = ffn_context_str + + self.position_encoder = RBFSpatialRelationPositionEncoder( + model_type=model_type, + train_locs=train_locs, + coord_dim=coord_dim, + num_rbf_anchor_pts=num_rbf_anchor_pts, + rbf_kernel_size=rbf_kernel_size, + rbf_kernel_size_ratio=rbf_kernel_size_ratio, + max_radius=max_radius, + rbf_anchor_pt_ids=rbf_anchor_pt_ids, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=self.ffn_skip_connection, + context_str=self.ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class SphereSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (deltaX,deltaY), encode them using the position encoding function + + """ + + def __init__( + self, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal with different frequencies/wavelengths + max_radius: the largest context radius this model can handle + """ + super().__init__(coord_dim=coord_dim, device=device) + self.coord_dim = coord_dim + self.frequency_num = frequency_num + self.freq_init = freq_init + self.max_radius = max_radius + self.min_radius = min_radius + # the frequence we use for each block, alpha in ICLR paper + self.cal_freq_list() + self.cal_freq_mat() + + self.pos_enc_output_dim = self.cal_pos_enc_output_dim() + + def cal_elementwise_angle(self, coord, cur_freq): + """ + Args: + coord: the deltaX or deltaY + cur_freq: the frequency + """ + return coord / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + + def cal_coord_embed(self, coords_tuple): + embed = [] + for coord in coords_tuple: + for cur_freq in range(self.frequency_num): + embed.append(math.sin(self.cal_elementwise_angle(coord, cur_freq))) + 
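+ # (added note) together with the cos term appended next, each frequency index i contributes the
+ # pair (sin(x / s_i), cos(x / s_i)) per coordinate, where s_i = max_radius ** (i / (frequency_num - 1))
+ # (see cal_elementwise_angle above), i.e. one sinusoid pair per geometrically spaced scale from 1 up to max_radius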
embed.append(math.cos(self.cal_elementwise_angle(coord, cur_freq))) + # embed: shape (pos_enc_output_dim) + return embed + + def cal_pos_enc_output_dim(self): + # compute the dimention of the encoded spatial relation embedding + return int( + 3 * self.frequency_num + ) # int(self.coord_dim * self.frequency_num * 2) + + def cal_freq_list(self): + self.freq_list = _cal_freq_list( + self.freq_init, self.frequency_num, self.max_radius, self.min_radius + ) + + def cal_freq_mat(self): + # freq_mat shape: (frequency_num, 1) + freq_mat = np.expand_dims(self.freq_list, axis=1) + # self.freq_mat shape: (frequency_num, 1) + self.freq_mat = freq_mat + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + # coords_mat: shape (batch_size, num_context_pt, 2, 1) + coords_mat = np.expand_dims(coords_mat, axis=3) + # coords_mat: shape (batch_size, num_context_pt, 2, 1, 1) + coords_mat = np.expand_dims(coords_mat, axis=4) + # coords_mat: shape (batch_size, num_context_pt, 2, frequency_num, 1) + coords_mat = np.repeat(coords_mat, self.frequency_num, axis=3) + + # spr_embeds: shape (batch_size, num_context_pt, 2, frequency_num, 1) + spr_embeds = coords_mat * self.freq_mat + + # convert to radius + spr_embeds = spr_embeds * math.pi / 180 + + # lon, lat: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon = np.expand_dims(spr_embeds[:, :, 0, :, :], axis=2) + lat = np.expand_dims(spr_embeds[:, :, 1, :, :], axis=2) + + # make sinuniod function + # lon_sin, lon_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon_sin = np.sin(lon) + lon_cos = np.cos(lon) + + # lat_sin, lat_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lat_sin = np.sin(lat) + lat_cos = np.cos(lat) + + # spr_embeds_: shape (batch_size, num_context_pt, 1, frequency_num, 3) + spr_embeds_ = np.concatenate( + [lat_sin, lat_cos * lon_cos, lat_cos * lon_sin], axis=-1 + ) + + # # make sinuniod function + # # sin for 2i, cos for 2i+1 + # # spr_embeds: (batch_size, num_context_pt, 2*frequency_num*2=pos_enc_output_dim) + # spr_embeds[:, :, :, :, 0::2] = np.sin(spr_embeds[:, :, :, :, 0::2]) # dim 2i + # spr_embeds[:, :, :, :, 1::2] = np.cos(spr_embeds[:, :, :, :, 1::2]) # dim 2i+1 + + # (batch_size, num_context_pt, frequency_num*3) + spr_embeds = np.reshape(spr_embeds_, (batch_size, num_context_pt, -1)) + + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + + spr_embeds = self.make_output_embeds(coords) + + # # loop over all batches + # spr_embeds = [] + # for cur_batch in coords: + # # loop over N context points + # cur_embeds = [] + # for coords_tuple in cur_batch: + # cur_embeds.append(self.cal_coord_embed(coords_tuple)) + # spr_embeds.append(cur_embeds) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, 
spa_embed_dim) + # sprenc = torch.einsum("bnd,dk->bnk", (spr_embeds, self.post_mat)) + # sprenc = self.f_act(self.dropout(self.post_linear(spr_embeds))) + + return spr_embeds + + +class SphereSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="SphereSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.frequency_num = frequency_num + self.max_radius = max_radius + self.min_radius = min_radius + self.freq_init = freq_init + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = SphereSpatialRelationPositionEncoder( + coord_dim=coord_dim, + frequency_num=frequency_num, + max_radius=max_radius, + min_radius=min_radius, + freq_init=freq_init, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class SphereGirdSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (deltaX,deltaY), enco + de them using the position encoding function + + """ + + def __init__( + self, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal with different frequencies/wavelengths + max_radius: the largest context radius this model can handle + """ + super().__init__(coord_dim=coord_dim, device=device) + self.frequency_num = frequency_num + self.freq_init = freq_init + self.max_radius = max_radius + self.min_radius = min_radius + # the frequence we use for each block, alpha in ICLR paper + self.cal_freq_list() + self.cal_freq_mat() + + self.pos_enc_output_dim = self.cal_pos_enc_output_dim() + + def cal_elementwise_angle(self, coord, cur_freq): + """ + Args: + coord: the deltaX or deltaY + cur_freq: the frequency + """ + return coord / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + + def cal_coord_embed(self, coords_tuple): + embed = [] + for coord in coords_tuple: + for cur_freq in range(self.frequency_num): + embed.append(math.sin(self.cal_elementwise_angle(coord, cur_freq))) + embed.append(math.cos(self.cal_elementwise_angle(coord, cur_freq))) + # embed: shape (pos_enc_output_dim) + return embed + + def cal_pos_enc_output_dim(self): + # compute the dimention of the encoded spatial relation embedding + return int( + 6 * self.frequency_num + ) # int(self.coord_dim * self.frequency_num * 2) + + def cal_freq_list(self): + self.freq_list = _cal_freq_list( + 
self.freq_init, self.frequency_num, self.max_radius, self.min_radius + ) + + def cal_freq_mat(self): + # freq_mat shape: (frequency_num, 1) + freq_mat = np.expand_dims(self.freq_list, axis=1) + # self.freq_mat shape: (frequency_num, 1) + self.freq_mat = freq_mat + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + # coords_mat: shape (batch_size, num_context_pt, 2, 1) + coords_mat = np.expand_dims(coords_mat, axis=3) + # coords_mat: shape (batch_size, num_context_pt, 2, 1, 1) + coords_mat = np.expand_dims(coords_mat, axis=4) + # coords_mat: shape (batch_size, num_context_pt, 2, frequency_num, 1) + coords_mat = np.repeat(coords_mat, self.frequency_num, axis=3) + + # spr_embeds: shape (batch_size, num_context_pt, 2, frequency_num, 1) + spr_embeds = coords_mat * self.freq_mat + + # convert to radius + spr_embeds = spr_embeds * math.pi / 180 + + # lon, lat: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon = np.expand_dims(spr_embeds[:, :, 0, :, :], axis=2) + lat = np.expand_dims(spr_embeds[:, :, 1, :, :], axis=2) + + # make sinuniod function + # lon_sin, lon_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon_sin = np.sin(lon) + lon_cos = np.cos(lon) + + # lat_sin, lat_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lat_sin = np.sin(lat) + lat_cos = np.cos(lat) + + # spr_embeds_: shape (batch_size, num_context_pt, 1, frequency_num, 6) + spr_embeds_ = np.concatenate( + [lat_sin, lat_cos, lon_sin, lon_cos, lat_cos * lon_cos, lat_cos * lon_sin], + axis=-1, + ) + + # (batch_size, num_context_pt, 2*frequency_num*6) + spr_embeds = np.reshape(spr_embeds_, (batch_size, num_context_pt, -1)) + + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + + spr_embeds = self.make_output_embeds(coords) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + # sprenc = torch.einsum("bnd,dk->bnk", (spr_embeds, self.post_mat)) + # sprenc = self.f_act(self.dropout(self.post_linear(spr_embeds))) + + return spr_embeds + + +class SphereGirdSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + device="cuda", + freq_init="geometric", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="SphereGirdSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.frequency_num = frequency_num + self.max_radius = max_radius + self.min_radius = min_radius + self.freq_init = freq_init + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + 
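+ # (added note) the ffn_* arguments stored here configure the MultiLayerFeedForwardNN built below,
+ # which projects the 6 * frequency_num dimensional sphere-grid position encoding into spa_embed_dim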
self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = SphereGirdSpatialRelationPositionEncoder( + coord_dim=coord_dim, + frequency_num=frequency_num, + max_radius=max_radius, + min_radius=min_radius, + freq_init=freq_init, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class SphereMixScaleSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (deltaX,deltaY), encode them using the position encoding function + + """ + + def __init__( + self, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal with different frequencies/wavelengths + max_radius: the largest context radius this model can handle + """ + super().__init__(coord_dim=coord_dim, device=device) + self.coord_dim = coord_dim + self.frequency_num = frequency_num + self.freq_init = freq_init + self.max_radius = max_radius + self.min_radius = min_radius + # the frequence we use for each block, alpha in ICLR paper + self.cal_freq_list() + self.cal_freq_mat() + + self.pos_enc_output_dim = self.cal_pos_enc_output_dim() + + def cal_elementwise_angle(self, coord, cur_freq): + """ + Args: + coord: the deltaX or deltaY + cur_freq: the frequency + """ + return coord / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + + def cal_coord_embed(self, coords_tuple): + embed = [] + for coord in coords_tuple: + for cur_freq in range(self.frequency_num): + embed.append(math.sin(self.cal_elementwise_angle(coord, cur_freq))) + embed.append(math.cos(self.cal_elementwise_angle(coord, cur_freq))) + # embed: shape (pos_enc_output_dim) + return embed + + def cal_pos_enc_output_dim(self): + # compute the dimention of the encoded spatial relation embedding + return int( + 5 * self.frequency_num + ) # int(self.coord_dim * self.frequency_num * 2) + + def cal_freq_list(self): + self.freq_list = _cal_freq_list( + self.freq_init, self.frequency_num, self.max_radius, self.min_radius + ) + + def cal_freq_mat(self): + # freq_mat shape: (frequency_num, 1) + freq_mat = np.expand_dims(self.freq_list, axis=1) + # self.freq_mat shape: (frequency_num, 1) + self.freq_mat = freq_mat + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + # coords_mat: shape (batch_size, num_context_pt, 2, 1) + coords_mat = np.expand_dims(coords_mat, axis=3) + # coords_mat: 
shape (batch_size, num_context_pt, 2, 1, 1) + coords_mat = np.expand_dims(coords_mat, axis=4) + # coords_mat: shape (batch_size, num_context_pt, 2, frequency_num, 1) + coords_mat = np.repeat(coords_mat, self.frequency_num, axis=3) + + # convert to radius + coords_mat = coords_mat * math.pi / 180 + + # lon, lat: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon_single = np.expand_dims(coords_mat[:, :, 0, :, :], axis=2) + lat_single = np.expand_dims(coords_mat[:, :, 1, :, :], axis=2) + + # make sinuniod function + # lon_sin, lon_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon_single_sin = np.sin(lon_single) + lon_single_cos = np.cos(lon_single) + + # lat_sin, lat_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lat_single_sin = np.sin(lat_single) + lat_single_cos = np.cos(lat_single) + + # spr_embeds: shape (batch_size, num_context_pt, 2, frequency_num, 1) + spr_embeds = coords_mat * self.freq_mat + + # lon, lat: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon = np.expand_dims(spr_embeds[:, :, 0, :, :], axis=2) + lat = np.expand_dims(spr_embeds[:, :, 1, :, :], axis=2) + + # make sinuniod function + # lon_sin, lon_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon_sin = np.sin(lon) + lon_cos = np.cos(lon) + + # lat_sin, lat_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lat_sin = np.sin(lat) + lat_cos = np.cos(lat) + + # spr_embeds_: shape (batch_size, num_context_pt, 1, frequency_num, 3) + spr_embeds_ = np.concatenate( + [ + lat_sin, + lat_cos * lon_single_cos, + lat_single_cos * lon_cos, + lat_cos * lon_single_sin, + lat_single_cos * lon_sin, + ], + axis=-1, + ) + + # (batch_size, num_context_pt, frequency_num*3) + spr_embeds = np.reshape(spr_embeds_, (batch_size, num_context_pt, -1)) + + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + + spr_embeds = self.make_output_embeds(coords) + + # # loop over all batches + # spr_embeds = [] + # for cur_batch in coords: + # # loop over N context points + # cur_embeds = [] + # for coords_tuple in cur_batch: + # cur_embeds.append(self.cal_coord_embed(coords_tuple)) + # spr_embeds.append(cur_embeds) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + # sprenc = torch.einsum("bnd,dk->bnk", (spr_embeds, self.post_mat)) + # sprenc = self.f_act(self.dropout(self.post_linear(spr_embeds))) + + return spr_embeds + + +class SphereMixScaleSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="SphereMixScaleSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.frequency_num = frequency_num + self.max_radius = max_radius + self.min_radius = min_radius + self.freq_init = freq_init + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = 
ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = SphereMixScaleSpatialRelationPositionEncoder( + coord_dim=coord_dim, + frequency_num=frequency_num, + max_radius=max_radius, + min_radius=min_radius, + freq_init=freq_init, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class SphereGridMixScaleSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (deltaX,deltaY), encode them using the position encoding function + + """ + + def __init__( + self, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal with different frequencies/wavelengths + max_radius: the largest context radius this model can handle + """ + super().__init__(coord_dim=coord_dim, device=device) + self.coord_dim = coord_dim + self.frequency_num = frequency_num + self.freq_init = freq_init + self.max_radius = max_radius + self.min_radius = min_radius + # the frequence we use for each block, alpha in ICLR paper + self.cal_freq_list() + self.cal_freq_mat() + + self.pos_enc_output_dim = self.cal_output_dim() + + def cal_elementwise_angle(self, coord, cur_freq): + """ + Args: + coord: the deltaX or deltaY + cur_freq: the frequency + """ + return coord / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + + def cal_coord_embed(self, coords_tuple): + embed = [] + for coord in coords_tuple: + for cur_freq in range(self.frequency_num): + embed.append(math.sin(self.cal_elementwise_angle(coord, cur_freq))) + embed.append(math.cos(self.cal_elementwise_angle(coord, cur_freq))) + # embed: shape (pos_enc_output_dim) + return embed + + def cal_output_dim(self): + # compute the dimention of the encoded spatial relation embedding + return int( + 8 * self.frequency_num + ) # int(self.coord_dim * self.frequency_num * 2) + + def cal_freq_list(self): + if self.freq_init == "random": + # the frequence we use for each block, alpha in ICLR paper + # self.freq_list shape: (frequency_num) + self.freq_list = ( + np.random.random(size=[self.frequency_num]) * self.max_radius + ) + elif self.freq_init == "geometric": + self.freq_list = [] + for cur_freq in range(self.frequency_num): + base = 1.0 / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + self.freq_list.append(base) + + self.freq_list = np.asarray(self.freq_list) + + def cal_freq_mat(self): + # freq_mat shape: (frequency_num, 1) + freq_mat = np.expand_dims(self.freq_list, axis=1) + # self.freq_mat shape: (frequency_num, 1) + self.freq_mat = freq_mat + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + 
"Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + # coords_mat: shape (batch_size, num_context_pt, 2, 1) + coords_mat = np.expand_dims(coords_mat, axis=3) + # coords_mat: shape (batch_size, num_context_pt, 2, 1, 1) + coords_mat = np.expand_dims(coords_mat, axis=4) + # coords_mat: shape (batch_size, num_context_pt, 2, frequency_num, 1) + coords_mat = np.repeat(coords_mat, self.frequency_num, axis=3) + + # convert to radius + coords_mat = coords_mat * math.pi / 180 + + # lon, lat: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon_single = np.expand_dims(coords_mat[:, :, 0, :, :], axis=2) + lat_single = np.expand_dims(coords_mat[:, :, 1, :, :], axis=2) + + # make sinuniod function + # lon_sin, lon_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon_single_sin = np.sin(lon_single) + lon_single_cos = np.cos(lon_single) + + # lat_sin, lat_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lat_single_sin = np.sin(lat_single) + lat_single_cos = np.cos(lat_single) + + # spr_embeds: shape (batch_size, num_context_pt, 2, frequency_num, 1) + spr_embeds = coords_mat * self.freq_mat + + # lon, lat: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon = np.expand_dims(spr_embeds[:, :, 0, :, :], axis=2) + lat = np.expand_dims(spr_embeds[:, :, 1, :, :], axis=2) + + # make sinuniod function + # lon_sin, lon_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lon_sin = np.sin(lon) + lon_cos = np.cos(lon) + + # lat_sin, lat_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1) + lat_sin = np.sin(lat) + lat_cos = np.cos(lat) + + # spr_embeds_: shape (batch_size, num_context_pt, 1, frequency_num, 3) + spr_embeds_ = np.concatenate( + [ + lat_sin, + lat_cos, + lon_sin, + lon_cos, + lat_cos * lon_single_cos, + lat_single_cos * lon_cos, + lat_cos * lon_single_sin, + lat_single_cos * lon_sin, + ], + axis=-1, + ) + + # (batch_size, num_context_pt, frequency_num*3) + spr_embeds = np.reshape(spr_embeds_, (batch_size, num_context_pt, -1)) + + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + + spr_embeds = self.make_output_embeds(coords) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + # sprenc = torch.einsum("bnd,dk->bnk", (spr_embeds, self.post_mat)) + # sprenc = self.f_act(self.dropout(self.post_linear(spr_embeds))) + + return spr_embeds + + +class SphereGridMixScaleSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="SphereGridMixScaleSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.frequency_num = frequency_num + self.max_radius = max_radius + self.min_radius = min_radius + self.freq_init = freq_init + 
self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = SphereGridMixScaleSpatialRelationPositionEncoder( + coord_dim=coord_dim, + frequency_num=frequency_num, + max_radius=max_radius, + min_radius=min_radius, + freq_init=freq_init, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class DFTSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (deltaX,deltaY), encode them using the position encoding function + + """ + + def __init__( + self, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: the number of different sinusoidal with different frequencies/wavelengths + max_radius: the largest context radius this model can handle + """ + super().__init__(coord_dim=coord_dim, device=device) + self.coord_dim = coord_dim + self.frequency_num = frequency_num + self.freq_init = freq_init + self.max_radius = max_radius + self.min_radius = min_radius + # the frequence we use for each block, alpha in ICLR paper + self.cal_freq_list() + self.cal_freq_mat() + + self.pos_enc_output_dim = self.cal_input_dim() + + def cal_elementwise_angle(self, coord, cur_freq): + """ + Args: + coord: the deltaX or deltaY + cur_freq: the frequency + """ + return coord / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + + def cal_coord_embed(self, coords_tuple): + embed = [] + for coord in coords_tuple: + for cur_freq in range(self.frequency_num): + embed.append(math.sin(self.cal_elementwise_angle(coord, cur_freq))) + embed.append(math.cos(self.cal_elementwise_angle(coord, cur_freq))) + # embed: shape (pos_enc_output_dim) + return embed + + def cal_input_dim(self): + # compute the dimention of the encoded spatial relation embedding + return ( + self.frequency_num * 4 + 4 * self.frequency_num * self.frequency_num + ) # int(self.coord_dim * self.frequency_num * 2) + + def cal_freq_list(self): + if self.freq_init == "random": + # the frequence we use for each block, alpha in ICLR paper + # self.freq_list shape: (frequency_num) + self.freq_list = ( + np.random.random(size=[self.frequency_num]) * self.max_radius + ) + elif self.freq_init == "geometric": + self.freq_list = [] + for cur_freq in range(self.frequency_num): + base = 1.0 / ( + np.power(self.max_radius, cur_freq * 1.0 / (self.frequency_num - 1)) + ) + self.freq_list.append(base) + + self.freq_list = np.asarray(self.freq_list) + + def cal_freq_mat(self): + # freq_mat shape: (frequency_num, 1) + freq_mat = np.expand_dims(self.freq_list, axis=1) + # self.freq_mat shape: (frequency_num, 1) + self.freq_mat = freq_mat + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert 
self.coord_dim == np.shape(coords)[2]
+            coords = list(coords)
+        elif type(coords) == list:
+            assert self.coord_dim == len(coords[0][0])
+        else:
+            raise Exception(
+                "Unknown coords data type for GridCellSpatialRelationEncoder"
+            )
+
+        # coords_mat: shape (batch_size, num_context_pt, 2)
+        coords_mat = np.asarray(coords).astype(float)
+        batch_size = coords_mat.shape[0]
+        num_context_pt = coords_mat.shape[1]
+        # coords_mat: shape (batch_size, num_context_pt, 2, 1)
+        coords_mat = np.expand_dims(coords_mat, axis=3)
+        # coords_mat: shape (batch_size, num_context_pt, 2, 1, 1)
+        coords_mat = np.expand_dims(coords_mat, axis=4)
+        # coords_mat: shape (batch_size, num_context_pt, 2, frequency_num, 1)
+        coords_mat = np.repeat(coords_mat, self.frequency_num, axis=3)
+
+        # convert degrees to radians
+        coords_mat = coords_mat * math.pi / 180
+
+        # lon, lat: shape (batch_size, num_context_pt, 1, frequency_num, 1)
+        lon_single = np.expand_dims(coords_mat[:, :, 0, :, :], axis=2)
+        lat_single = np.expand_dims(coords_mat[:, :, 1, :, :], axis=2)
+
+        # apply sinusoidal functions
+        # lon_sin, lon_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1)
+        lon_single_sin = np.sin(lon_single)
+        lon_single_cos = np.cos(lon_single)
+
+        # lat_sin, lat_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1)
+        lat_single_sin = np.sin(lat_single)
+        lat_single_cos = np.cos(lat_single)
+
+        # spr_embeds: shape (batch_size, num_context_pt, 2, frequency_num, 1)
+        spr_embeds = coords_mat * self.freq_mat
+
+        # lon, lat: shape (batch_size, num_context_pt, 1, frequency_num, 1)
+        lon = np.expand_dims(spr_embeds[:, :, 0, :, :], axis=2)
+        lat = np.expand_dims(spr_embeds[:, :, 1, :, :], axis=2)
+
+        # apply sinusoidal functions
+        # lon_sin, lon_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1)
+        lon_sin = np.sin(lon)
+        lon_cos = np.cos(lon)
+
+        # lat_sin, lat_cos: shape (batch_size, num_context_pt, 1, frequency_num, 1)
+        lat_sin = np.sin(lat)
+        lat_cos = np.cos(lat)
+
+        # lon_feq: shape (batch_size, num_context_pt, 1, 2*frequency_num, 1)
+        lon_feq = np.concatenate([lon_sin, lon_cos], axis=-2)
+
+        # lat_feq: shape (batch_size, num_context_pt, 1, 2*frequency_num, 1)
+        lat_feq = np.concatenate([lat_sin, lat_cos], axis=-2)
+
+        # lat_feq_t: shape (batch_size, num_context_pt, 1, 1, 2*frequency_num)
+        lat_feq_t = lat_feq.transpose(0, 1, 2, 4, 3)
+
+        # coord_freq: shape (batch_size, num_context_pt, 1, 2*frequency_num, 2*frequency_num)
+        coord_freq = np.einsum("abcde,abcek->abcdk", lon_feq, lat_feq_t)
+
+        # coord_freq_: shape (batch_size, num_context_pt, 2*frequency_num * 2*frequency_num)
+        coord_freq_ = np.reshape(coord_freq, (batch_size, num_context_pt, -1))
+
+        # coord_1: shape (batch_size, num_context_pt, 1, frequency_num, 4)
+        coord_1 = np.concatenate([lat_sin, lat_cos, lon_sin, lon_cos], axis=-1)
+
+        # coord_1_: shape (batch_size, num_context_pt, frequency_num * 4)
+        coord_1_ = np.reshape(coord_1, (batch_size, num_context_pt, -1))
+
+        # spr_embeds_: shape (batch_size, num_context_pt, frequency_num * 4 + 4 * frequency_num^2)
+        spr_embeds_ = np.concatenate([coord_freq_, coord_1_], axis=-1)
+
+        # # apply sinusoidal functions
+        # # sin for 2i, cos for 2i+1
+        # # spr_embeds: (batch_size, num_context_pt, 2*frequency_num*2=pos_enc_output_dim)
+        # spr_embeds[:, :, :, :, 0::2] = np.sin(spr_embeds[:, :, :, :, 0::2])  # dim 2i
+        # spr_embeds[:, :, :, :, 1::2] = np.cos(spr_embeds[:, :, :, :, 1::2])  # dim 2i+1
+
+        # spr_embeds: shape (batch_size, num_context_pt, frequency_num * 4 + 4 * frequency_num^2 = pos_enc_output_dim)
+        spr_embeds = np.reshape(spr_embeds_, (batch_size, num_context_pt, -1))
+
+        return
spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + + spr_embeds = self.make_output_embeds(coords) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + # sprenc = torch.einsum("bnd,dk->bnk", (spr_embeds, self.post_mat)) + # sprenc = self.f_act(self.dropout(self.post_linear(spr_embeds))) + + # return sprenc + return spr_embeds + + +class DFTSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + frequency_num=16, + max_radius=10000, + min_radius=10, + freq_init="geometric", + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="DFTSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.frequency_num = frequency_num + self.max_radius = max_radius + self.min_radius = min_radius + self.freq_init = freq_init + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = DFTSpatialRelationPositionEncoder( + coord_dim=coord_dim, + frequency_num=frequency_num, + max_radius=max_radius, + min_radius=min_radius, + freq_init=freq_init, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class RFFSpatialRelationPositionEncoder(PositionEncoder): + """ + Random Fourier Feature + Based on paper - Random Features for Large-Scale Kernel Machines + https://people.eecs.berkeley.edu/~brecht/papers/07.rah.rec.nips.pdf + + Given a list of (deltaX,deltaY), encode them using the position encoding function + + """ + + def __init__( + self, + coord_dim=2, + frequency_num=16, + rbf_kernel_size=1.0, + extent=None, + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + extent: (x_min, x_max, y_min, y_max) + coord_dim: the dimention of space, 2D, 3D, or other + frequency_num: here, we understand it as the RFF embeding dimension before FNN + """ + super().__init__(coord_dim=coord_dim, device=device) + self.frequency_num = frequency_num + self.rbf_kernel_size = rbf_kernel_size + self.extent = extent + + self.generate_direction_vector() + + self.pos_enc_output_dim = self.frequency_num + + def generate_direction_vector(self): + """ + Generate K direction vector (omega) and shift vector (b) + Return: + dirvec: shape (coord_dim, frequency_num), omega in the paper + shift: shape (frequency_num), b in the paper + """ + # mean and covarance matrix of the Gaussian distribution + 
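+        # Random Fourier Features (Rahimi & Recht, 2007): sample frequency_num direction
+        # vectors omega from a zero-mean Gaussian with covariance rbf_kernel_size * I, and
+        # shifts b from Uniform(0, 2*pi). The features cos(omega^T x + b), scaled by
+        # sqrt(2 / frequency_num) in make_output_embeds, approximate a shift-invariant
+        # (RBF-style) kernel in expectation.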
self.mean = np.zeros(self.coord_dim) + self.cov = np.diag(np.ones(self.coord_dim) * self.rbf_kernel_size) + # dirvec: shape (coord_dim, frequency_num), omega in the paper + dirvec = np.transpose( + np.random.multivariate_normal(self.mean, self.cov, self.frequency_num) + ) + self.dirvec = torch.nn.Parameter(torch.FloatTensor(dirvec), requires_grad=False) + self.register_parameter("dirvec", self.dirvec) + + # shift: shape (frequency_num), b in the paper + shift = np.random.uniform(0, 2 * np.pi, self.frequency_num) + self.shift = torch.nn.Parameter(torch.FloatTensor(shift), requires_grad=False) + self.register_parameter("shift", self.shift) + + def make_output_embeds(self, coords): + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for GridCellSpatialRelationEncoder" + ) + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = np.asarray(coords).astype(float) + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = coord_normalize(coords_mat, self.extent) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = torch.FloatTensor(coords_mat).to(self.device) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim = frequency_num) + spr_embeds = torch.matmul(coords_mat, self.dirvec) + + spr_embeds = torch.cos(spr_embeds + self.shift) * np.sqrt( + 2.0 / self.pos_enc_output_dim + ) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim = frequency_num) + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim = frequency_num) + spr_embeds = self.make_output_embeds(coords) + + return spr_embeds + + +class RFFSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + frequency_num=16, + rbf_kernel_size=1.0, + extent=None, + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="RFFSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.spa_embed_dim = spa_embed_dim + self.coord_dim = coord_dim + self.frequency_num = frequency_num + self.rbf_kernel_size = rbf_kernel_size + self.extent = extent + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = RFFSpatialRelationPositionEncoder( + coord_dim=coord_dim, + frequency_num=frequency_num, + rbf_kernel_size=rbf_kernel_size, + extent=extent, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + 
skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class GridLookupSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (deltaX,deltaY), + divide the space into grids, each point is using the grid embedding it falls into + + """ + + def __init__( + self, + spa_embed_dim, + model_type="global", + max_radius=None, + coord_dim=2, + interval=1000000, + extent=[-180, 180, -90, 90], + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + interval: the cell size in X and Y direction + extent: (left, right, bottom, top) + "global": the extent of the study area (-1710000, -1690000, 1610000, 1640000) + "relative": the extent of the relative context + """ + super().__init__(coord_dim=coord_dim, device=device) + self.spa_embed_dim = spa_embed_dim + self.interval = interval + self.model_type = model_type + self.max_radius = max_radius + assert extent[0] < extent[1] + assert extent[2] < extent[3] + + self.extent = self.get_spatial_context(model_type, extent, max_radius) + self.make_grid_embedding(self.interval, self.extent) + + self.pos_enc_output_dim = spa_embed_dim + + def get_spatial_context(self, model_type, extent, max_radius): + if model_type == "global": + extent = extent + elif model_type == "relative": + extent = ( + -max_radius - 500, + max_radius + 500, + -max_radius - 500, + max_radius + 500, + ) + return extent + + def make_grid_embedding(self, interval, extent): + self.num_col = int(math.ceil(float(extent[1] - extent[0]) / interval)) + self.num_row = int(math.ceil(float(extent[3] - extent[2]) / interval)) + + self.embedding = torch.nn.Embedding( + self.num_col * self.num_row, self.spa_embed_dim + ) + self.embedding.weight.data.normal_(0, 1.0 / self.spa_embed_dim) + + def make_output_embeds(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt=1, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, input_embed_dim) + """ + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception("Unknown coords data type for RBFSpatialRelationEncoder") + + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # x or y: shape (batch_size, num_context_pt) + x = coords_mat[:, :, 0] + y = coords_mat[:, :, 1] + + col = np.floor((x - self.extent[0]) / self.interval) + row = np.floor((y - self.extent[2]) / self.interval) + + # make sure each row/col index is within range + col = np.clip(col, 0, self.num_col - 1) + row = np.clip(row, 0, self.num_row - 1) + + # index_mat: shape (batch_size, num_context_pt) + index_mat = (row * self.num_col + col).astype(int) + # index_mat: shape (batch_size, num_context_pt) + index_mat = torch.LongTensor(index_mat).to(self.device) + + spr_embeds = self.embedding(torch.autograd.Variable(index_mat)) + # spr_embeds: shape (batch_size, num_context_pt, spa_embed_dim) + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: 
a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + # spr_embeds: shape (batch_size, num_context_pt, spa_embed_dim) + spr_embeds = self.make_output_embeds(coords).to(self.device) + + # sprenc: shape (batch_size, num_context_pt, spa_embed_dim) + # sprenc = torch.einsum("bnd,dk->bnk", (spr_embeds, self.post_mat)) + + return spr_embeds + + +class GridLookupSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + extent, + interval, + coord_dim=2, + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="GridLookupSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.extent = extent + self.interval = (interval,) + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = GridLookupSpatialRelationPositionEncoder( + spa_embed_dim=spa_embed_dim, + model_type="global", + coord_dim=coord_dim, + extent=extent, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + + +class AodhaFFTSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (deltaX,deltaY), + divide the space into grids, each point is using the grid embedding it falls into + + spa_enc_type == "geo_net_fft" + """ + + def __init__( + self, + extent, + coord_dim=2, + do_pos_enc=False, + do_global_pos_enc=True, + device="cuda", + ): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + if do_pos_enc == False: + coord_dim is the computed loc_feat according to get_model_input_feat_dim() + extent: (x_min, x_max, y_min, y_max) + do_pos_enc: True - we normalize the lat/lon, and [sin(pi*x), cos(pi*x), sin(pi*y), cos(pi*y)] + False - we assume the input is prenormalized coordinate features + do_global_pos_enc: if do_pos_enc == True: + True - lon/180 and lat/90 + False - min-max normalize based on extent + f_act: the final activation function, relu ot none + """ + super().__init__(coord_dim=coord_dim, device=device) + self.extent = extent + self.coord_dim = coord_dim + + self.do_pos_enc = do_pos_enc + self.do_global_pos_enc = do_global_pos_enc + self.pos_enc_output_dim = 4 + + def make_output_embeds(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt=1, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, pos_enc_output_dim) + """ + if type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + # coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + 
raise Exception("Unknown coords data type for AodhaSpatialRelationEncoder") + + assert coords.shape[-1] == 2 + # coords: shape (batch_size, num_context_pt, 2) + # coords_mat: shape (batch_size, num_context_pt, 2) + coords_mat = coord_normalize( + coords, self.extent, do_global=self.do_global_pos_enc + ) + + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + loc_sin = np.sin(math.pi * coords_mat) + loc_cos = np.cos(math.pi * coords_mat) + # spr_embeds: shape (batch_size, num_context_pt, 4) + spr_embeds = np.concatenate((loc_sin, loc_cos), axis=-1) + + # spr_embeds: shape (batch_size, num_context_pt, 4) + return spr_embeds + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + if self.do_pos_enc: + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = self.make_output_embeds(coords) + # assert self.pos_enc_output_dim == np.shape(spr_embeds)[2] + else: + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = coords + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + return spr_embeds + + +class AodhaFFNSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + extent, + coord_dim=2, + do_pos_enc=False, + do_global_pos_enc=True, + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="AodhaFFTSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.extent = extent + self.do_pos_enc = do_pos_enc + self.do_global_pos_enc = do_global_pos_enc + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + + self.position_encoder = AodhaFFTSpatialRelationPositionEncoder( + coord_dim=coord_dim, + extent=extent, + do_pos_enc=do_pos_enc, + do_global_pos_enc=do_global_pos_enc, + device=device, + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc + +class SphericalHarmonicsSpatialRelationPositionEncoder(PositionEncoder): + """ + Given a list of (lon,lat), convert them to (x,y,z), and then encode them using the MLP + + """ + + def __init__(self, coord_dim=2, legendre_poly_num=8, device="cuda"): + """ + Args: + spa_embed_dim: the output spatial relation embedding dimention + coord_dim: the dimention of space, 2D, 3D, or other + extent: (x_min, x_max, y_min, y_max) + """ + super().__init__(coord_dim=coord_dim, device=device) + + self.legendre_poly_num = legendre_poly_num + self.pos_enc_output_dim = legendre_poly_num**2 + + def make_output_embeds(self, coords): + if 
type(coords) == np.ndarray: + assert self.coord_dim == np.shape(coords)[2] + coords = list(coords) + elif type(coords) == list: + assert self.coord_dim == len(coords[0][0]) + else: + raise Exception( + "Unknown coords data type for SphericalHarmonicsSpatialRelationEncoder" + ) + + # (batch_size, num_context_pt, coord_dim) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # lon: (batch_size, num_context_pt, 1), convert from degree to radius + lon = np.deg2rad(coords_mat[:, :, :1]) + # lat: (batch_size, num_context_pt, 1), convert from degree to radius + lat = np.deg2rad(coords_mat[:, :, 1:]) + + # spr_embeds: (batch_size, num_context_pt, legendre_poly_num**2) + spr_embeds = get_positional_encoding(lon, lat, self.legendre_poly_num) + return spr_embeds.reshape((-1, self.pos_enc_output_dim)) + + def forward(self, coords): + """ + Given a list of coords (deltaX, deltaY), give their spatial relation embedding + Args: + coords: a python list with shape (batch_size, num_context_pt, coord_dim) + Return: + sprenc: Tensor shape (batch_size, num_context_pt, spa_embed_dim) + """ + # (batch_size, num_context_pt, coord_dim) + coords_mat = np.asarray(coords).astype(float) + batch_size = coords_mat.shape[0] + num_context_pt = coords_mat.shape[1] + + # spr_embeds: (batch_size, num_context_pt, 3) + spr_embeds = self.make_output_embeds(coords) + + # spr_embeds: shape (batch_size, num_context_pt, pos_enc_output_dim = 3) + spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) + + return spr_embeds + +class SphericalHarmonicsSpatialRelationLocationEncoder(LocationEncoder): + def __init__( + self, + spa_embed_dim, + coord_dim=2, + legendre_poly_num=8, + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="SphericalHarmonicsSpatialRelationEncoder", + ): + super().__init__(spa_embed_dim, coord_dim, device) + self.ffn_act = ffn_act + self.ffn_num_hidden_layers = ffn_num_hidden_layers + self.ffn_dropout_rate = ffn_dropout_rate + self.ffn_hidden_dim = ffn_hidden_dim + self.ffn_use_layernormalize = ffn_use_layernormalize + self.ffn_skip_connection = ffn_skip_connection + self.ffn_context_str = ffn_context_str + + self.position_encoder = SphericalHarmonicsSpatialRelationPositionEncoder( + coord_dim=coord_dim, legendre_poly_num=legendre_poly_num, device=device + ) + self.ffn = MultiLayerFeedForwardNN( + input_dim=self.position_encoder.pos_enc_output_dim, + output_dim=self.spa_embed_dim, + num_hidden_layers=self.ffn_num_hidden_layers, + dropout_rate=ffn_dropout_rate, + hidden_dim=self.ffn_hidden_dim, + activation=self.ffn_act, + use_layernormalize=self.ffn_use_layernormalize, + skip_connection=ffn_skip_connection, + context_str=ffn_context_str, + ) + + def forward(self, coords): + spr_embeds = self.position_encoder(coords) + sprenc = self.ffn(spr_embeds) + + return sprenc diff --git a/main/eva_sh/eva_rbf_nabirds.sh b/main/eva_sh/eva_rbf_nabirds.sh deleted file mode 100644 index 45427b62..00000000 --- a/main/eva_sh/eva_rbf_nabirds.sh +++ /dev/null @@ -1,46 +0,0 @@ -#!/bin/bash - -DIR=../models/rbf/ - -ENC=rbf - -DATA=nabirds -META=ebird_meta -EVALDATA=test - -DEVICE=cuda:3 - -LR=0.0005 -LAYER=1 -HIDDIM=512 -FREQ=64 -MINR=0.0005 -MAXR=1 -EPOCH=30 - -ACT=relu -RATIO=1.0 - -KERNELSIZE=2 -ANCHOR=200 - - -python3 train_unsuper.py \ - --spa_enc_type $ENC \ - --meta_type $META\ - --dataset $DATA \ - --eval_split $EVALDATA 
\ - --frequency_num $FREQ \ - --max_radius $MAXR \ - --min_radius $MINR \ - --num_hidden_layer $LAYER \ - --hidden_dim $HIDDIM \ - --spa_f_act $ACT \ - --unsuper_lr 0.1 \ - --lr $LR \ - --model_dir $DIR \ - --num_epochs $EPOCH \ - --train_sample_ratio $RATIO \ - --device $DEVICE \ - --rbf_kernel_size $KERNELSIZE \ - --num_rbf_anchor_pts $ANCHOR diff --git a/main/eva_sh/eva_rff_bird_meta.sh b/main/eva_sh/eva_rff_bird_meta.sh deleted file mode 100644 index a7272799..00000000 --- a/main/eva_sh/eva_rff_bird_meta.sh +++ /dev/null @@ -1,50 +0,0 @@ -#!/bin/bash - -DIR=../models/rff/ - -ENC=rff - -DATA=birdsnap -META=ebird_meta -EVALDATA=test - -DEVICE=cuda:1 - -LR=0.002 -LAYER=1 -HIDDIM=512 -FREQ=64 -MINR=0.000001 -MAXR=1 -KERNELSIZE=2 -################# Please set “--num_epochs” to be 0, because you do not want further train the model. ################# -EPOCH=30 - -ACT=relu -RATIO=1.0 - -################# Now you have a set of hyperparameter fixed, so cancel the loops ################# -################# Please set “–save_results” to be T AND “--load_super_model” to be T ################# - - -python3 train_unsuper.py \ - --save_results T\ - --load_super_model T\ - --spa_enc_type $ENC \ - --meta_type $META\ - --dataset $DATA \ - --eval_split $EVALDATA \ - --frequency_num $FREQ \ - --max_radius $MAXR \ - --min_radius $MINR \ - --num_hidden_layer $LAYER \ - --hidden_dim $HIDDIM \ - --spa_f_act $ACT \ - --unsuper_lr 0.1 \ - --lr $LR \ - --model_dir $DIR \ - --num_epochs $EPOCH \ - --train_sample_ratio $RATIO \ - --device $DEVICE \ - --rbf_kernel_size $KERNELSIZE - diff --git a/pre_process/dhs/rbf_regress.py b/pre_process/dhs/rbf_regress.py deleted file mode 100644 index 8cd1f6ff..00000000 --- a/pre_process/dhs/rbf_regress.py +++ /dev/null @@ -1,275 +0,0 @@ -import os -import numpy as np -import pandas as pd -from sklearn.metrics import r2_score - -import torch -from torch.utils.data import Dataset, DataLoader - -import torch.nn as nn -import torch.optim as optim - -from datetime import datetime - -import optuna -from optuna.pruners import MedianPruner -from optuna.trial import TrialState -import logging -from utils import save_checkpoint, load_checkpoint, RBFSpatialRelationPositionEncoder - -pd.set_option("display.max_columns", None) - -params = { - 'dataset_root_dir': '../../sustainbench/data/dhs', - 'label': 'asset_index_normalized', - 'checkpoint_dir': './checkpoints', - 'load_checkpoint': False, - 'batch_size': 512, - 'epochs': 50, - 'lr': 0.005, - 'num_rbf_anchor_pts': 13, - 'rbf_kernel_size':80, - 'model_type':"global", - 'device': 'cuda:0' -} - -# Add file handler to save logs to a file -current_time = datetime.now().strftime("%Y%m%d_%H%M%S") -log_file = os.path.join('logs', f'optuna_tuning_{params["label"]}_{current_time}.log') -logging.basicConfig(level=logging.INFO) -logger = logging.getLogger(__name__) -file_handler = logging.FileHandler(log_file) -file_handler.setLevel(logging.INFO) -formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s') -file_handler.setFormatter(formatter) -logger.addHandler(file_handler) - -train_df = pd.read_csv(os.path.join(params['dataset_root_dir'], "dhs_trainval_labels.csv")) -val_df = pd.read_csv(os.path.join(params['dataset_root_dir'], "dhs_val_labels.csv")) -test_df = pd.read_csv(os.path.join(params['dataset_root_dir'], "dhs_test_labels.csv")) - -train_df = train_df.dropna(subset=['asset_index_normalized']) -val_df = val_df.dropna(subset=['asset_index_normalized']) -test_df = test_df.dropna(subset=['asset_index_normalized']) - 
-class RSDataset(Dataset): - def __init__(self, dataframe): - self.dataframe = dataframe - - def __len__(self): - return len(self.dataframe) - - def __getitem__(self, idx): - # Get the nl_mean directly from the dataframe - nl_mean = self.dataframe.iloc[idx]["nl_mean"] - label = self.dataframe.iloc[idx]["asset_index_normalized"] - if pd.isna(label): - return None - - return nl_mean, label - -train_dataset = RSDataset(train_df) -val_dataset = RSDataset(val_df) -test_dataset = RSDataset(test_df) - -train_loader = DataLoader(train_dataset, batch_size=params['batch_size'], shuffle=True) -val_loader = DataLoader(val_dataset, batch_size=params['batch_size'], shuffle=False) -test_loader = DataLoader(test_dataset, batch_size=params['batch_size'], shuffle=False) - - - -class MLP(nn.Module): - def __init__(self, train_dataset, device, num_rbf_anchor_pts=params['num_rbf_anchor_pts'], rbf_kernel_size=params['rbf_kernel_size']): - super(MLP, self).__init__() - self.position_encoder = RBFSpatialRelationPositionEncoder(train_locs=train_dataset, num_rbf_anchor_pts=num_rbf_anchor_pts, rbf_kernel_size=rbf_kernel_size, device=device) - self.model = nn.Sequential( - nn.Linear(self.position_encoder.pos_enc_output_dim, 64), - nn.ReLU(), - nn.Linear(64, 32), - nn.ReLU(), - nn.Linear(32, 1) - ) - - def forward(self, x): - loc_embed = self.position_encoder(x) - return self.model(loc_embed) - -nl_mean_list = [] -for batch in train_loader: - nl_mean_batch, _ = batch - nl_mean_list.extend(nl_mean_batch.numpy()) - -### train_nl_mean_array is a tensor of shape (num_train, 1) -train_nl_mean_array = np.array(nl_mean_list).reshape(-1, 1) - -# model = MLP(train_dataset=train_nl_mean_array, device=params['device']) -# model = model.to(params['device']) - -# criterion = nn.MSELoss() -# optimizer = optim.Adam(model.parameters(), lr=params['lr']) - -# best_val_loss = float("inf") - -# start_epoch = 0 -# if params['load_checkpoint']: -# model, optimizer, best_val_loss, start_epoch = load_checkpoint(model, optimizer, params['checkpoint_dir']) - -# num_epochs = params['epochs'] - -def objective(trial): - num_rbf_anchor_pts = trial.suggest_int('num_rbf_anchor_pts', 1, 30) - rbf_kernel_size = trial.suggest_int('rbf_kernel_size', 2, 50) - lr = trial.suggest_loguniform('lr', 1e-5, 1e-1) - - model = MLP(train_dataset=train_nl_mean_array, device=params['device'], - num_rbf_anchor_pts=num_rbf_anchor_pts, rbf_kernel_size=rbf_kernel_size) - model = model.to(params['device']) - - criterion = nn.MSELoss() - optimizer = optim.Adam(model.parameters(), lr=lr) - - num_epochs = 30 # Use fewer epochs for tuning - for epoch in range(num_epochs): - model.train() - train_loss = 0.0 - for batch_idx, (nl_means, labels) in enumerate(train_loader): - nl_means, labels = nl_means.to(params['device']), labels.to(params['device']).float() - optimizer.zero_grad() - nl_means = nl_means.reshape(nl_means.size(0), 1, 1) - outputs = model(nl_means.cpu().numpy()) - loss = criterion(outputs.squeeze(), labels) - loss.backward() - optimizer.step() - train_loss += loss.item() * nl_means.size(0) - - train_loss /= len(train_loader.dataset) - - model.eval() - val_loss = 0.0 - with torch.no_grad(): - for nl_means, labels in val_loader: - nl_means, labels = nl_means.to(params['device']), labels.to(params['device']).float() - nl_means = nl_means.reshape(nl_means.size(0), 1, 1) - outputs = model(nl_means.cpu().numpy()) - loss = criterion(outputs.squeeze(), labels) - val_loss += loss.item() * nl_means.size(0) - - val_loss /= len(val_loader.dataset) - - logger.info(f"Trial 
{trial.number}, Epoch {epoch+1}/{num_epochs}, " - f"Training Loss: {train_loss:.4f}, Validation Loss: {val_loss:.4f}") - - trial.report(val_loss, epoch) - - if trial.should_prune(): - raise optuna.exceptions.TrialPruned() - - # Calculate R² score on the test set - model.eval() - all_preds = [] - all_labels = [] - with torch.no_grad(): - for nl_means, labels in test_loader: - nl_means, labels = nl_means.to(params['device']), labels.to(params['device']).float() - nl_means = nl_means.reshape(nl_means.size(0), 1, 1) - outputs = model(nl_means.cpu().numpy()) - all_preds.append(outputs.cpu().numpy()) - all_labels.append(labels.cpu().numpy()) - - all_preds = np.concatenate(all_preds) - all_labels = np.concatenate(all_labels) - r2 = r2_score(all_labels, all_preds) - - logger.info(f"Trial {trial.number}, Test R²: {r2:.4f}") - - return r2 - -pruner = MedianPruner(n_startup_trials=5, n_warmup_steps=5, interval_steps=1) -study = optuna.create_study(direction='maximize', pruner=pruner) -study.optimize(objective, n_trials=500, timeout=7200) - -logger.info("Number of finished trials: %d", len(study.trials)) -logger.info("Best trial:") - -trial = study.best_trial - -logger.info(" Value: %f", trial.value) -logger.info(" Params: ") -for key, value in trial.params.items(): - logger.info(" %s: %s", key, value) - -# for epoch in range(start_epoch, num_epochs): -# model.train() -# train_loss = 0.0 -# for batch_idx, (nl_means, labels) in enumerate(train_loader): -# nl_means, labels = nl_means.to(params['device']), labels.to(params['device']).float() -# optimizer.zero_grad() -# # nl_means shape is (batch_size, 1, 1) -# nl_means = nl_means.reshape(nl_means.size(0), 1, 1) -# outputs = model(nl_means.cpu().numpy()) -# loss = criterion(outputs.squeeze(), labels) -# loss.backward() -# optimizer.step() -# train_loss += loss.item() * nl_means.size(0) - -# if batch_idx % 10 == 0: -# print(f"Epoch {epoch+1}/{params['epochs']}, Batch {batch_idx+1}/{len(train_loader)}, Training Loss: {loss.item():.4f}") - -# train_loss /= len(train_loader.dataset) - -# model.eval() -# val_loss = 0.0 -# all_preds = [] -# all_labels = [] -# with torch.no_grad(): -# for nl_means, labels in test_loader: -# nl_means, labels = nl_means.to(params['device']), labels.to(params['device']).float() -# nl_means = nl_means.reshape(nl_means.size(0), 1, 1) -# outputs = model(nl_means.cpu().numpy()) -# loss = criterion(outputs.squeeze(), labels) -# val_loss += loss.item() * nl_means.size(0) -# all_preds.append(outputs.cpu().numpy()) -# all_labels.append(labels.cpu().numpy()) - -# val_loss /= len(test_loader.dataset) - -# all_preds = np.concatenate(all_preds) -# all_labels = np.concatenate(all_labels) - -# r2 = r2_score(all_labels, all_preds) - -# print(f"Epoch {epoch+1}/{num_epochs}, Training Loss: {train_loss}, Test Loss: {val_loss}, R²: {r2:.4f}") - -# is_best = val_loss < best_val_loss -# best_val_loss = min(val_loss, best_val_loss) - -# save_checkpoint({ -# 'epoch': epoch + 1, -# 'state_dict': model.state_dict(), -# 'best_val_loss': best_val_loss, -# 'optimizer': optimizer.state_dict(), -# }, is_best, params['checkpoint_dir']) - -# model.eval() -# test_loss = 0.0 -# all_preds = [] -# all_labels = [] -# with torch.no_grad(): -# for batch_idx, (inputs, labels) in enumerate(test_loader): -# inputs, labels = inputs.to(params['device']), labels.to(params['device']).float().unsqueeze(1) -# inputs = inputs.reshape(inputs.size(0), 1, 1) -# outputs = model(inputs.cpu().numpy()) -# loss = criterion(outputs, labels) -# test_loss += loss.item() * inputs.size(0) -# 
all_preds.append(outputs.cpu().numpy()) -# all_labels.append(labels.cpu().numpy()) - -# if batch_idx % 10 == 0: -# print(f"Test Batch {batch_idx+1}/{len(test_loader)}, Loss: {loss.item():.4f}") - -# test_loss /= len(test_loader.dataset) -# all_preds = np.concatenate(all_preds) -# all_labels = np.concatenate(all_labels) - -# r2 = r2_score(all_labels, all_preds) -# print(f"Test Loss: {test_loss:.4f}, R²: {r2:.4f}") diff --git a/pre_process/dhs/rbf_regress_optuna.py b/pre_process/dhs/rbf_regress_optuna.py deleted file mode 100644 index 0a2ad4b6..00000000 --- a/pre_process/dhs/rbf_regress_optuna.py +++ /dev/null @@ -1,218 +0,0 @@ -import os -import numpy as np -import pandas as pd -from sklearn.metrics import r2_score - -import torch -from torch.utils.data import Dataset, DataLoader - -import torch.nn as nn -import torch.optim as optim - -from datetime import datetime - -import optuna -from optuna.pruners import MedianPruner -from optuna.trial import TrialState -import logging -from utils import save_checkpoint, load_checkpoint, RBFFeaturePositionEncoder - -pd.set_option("display.max_columns", None) - -import argparse - -# Define the command-line arguments -parser = argparse.ArgumentParser(description="Hyperparameter Optimization with Optuna") -parser.add_argument('--dataset_root_dir', type=str, default='../../sustainbench/data/dhs', help='Root directory of the dataset') -parser.add_argument('--checkpoint_dir', type=str, default='./checkpoints', help='Directory to save checkpoints') -parser.add_argument('--load_checkpoint', action='store_true', help='Load checkpoint if available') -parser.add_argument('--batch_size', type=int, default=512, help='Batch size for training') -parser.add_argument('--epochs', type=int, default=100, help='Number of epochs for training') -parser.add_argument('--lr', type=float, default=0.005, help='Learning rate') -parser.add_argument('--num_rbf_anchor_pts', type=int, default=13, help='Number of RBF anchor points') -parser.add_argument('--rbf_kernel_size', type=int, default=80, help='RBF kernel size') -parser.add_argument('--model_type', type=str, default='global', help='Type of model') -parser.add_argument('--device', type=str, default='cuda:0', help='Device to use for training') -parser.add_argument('--label', type=str, default='experiment_1', help='Label for the experiment') - -args = parser.parse_args() -params = { - 'dataset_root_dir': args.dataset_root_dir, - 'checkpoint_dir': args.checkpoint_dir, - 'load_checkpoint': args.load_checkpoint, - 'batch_size': args.batch_size, - 'epochs': args.epochs, - 'lr': args.lr, - 'num_rbf_anchor_pts': args.num_rbf_anchor_pts, - 'rbf_kernel_size': args.rbf_kernel_size, - 'model_type': args.model_type, - 'device': args.device, - 'label': args.label -} - -# Add file handler to save logs to a file -current_time = datetime.now().strftime("%Y%m%d_%H%M%S") -log_file = os.path.join('logs', f'optuna_tuning_{params["label"]}_{current_time}.log') -logging.basicConfig(level=logging.INFO) -logger = logging.getLogger(__name__) -file_handler = logging.FileHandler(log_file) -file_handler.setLevel(logging.INFO) -formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s') -file_handler.setFormatter(formatter) -logger.addHandler(file_handler) - -train_df = pd.read_csv(os.path.join(params['dataset_root_dir'], "dhs_trainval_labels.csv")) -val_df = pd.read_csv(os.path.join(params['dataset_root_dir'], "dhs_val_labels.csv")) -test_df = pd.read_csv(os.path.join(params['dataset_root_dir'], "dhs_test_labels.csv")) - -train_df = 
train_df.dropna(subset=[params['label']]) -val_df = val_df.dropna(subset=[params['label']]) -test_df = test_df.dropna(subset=[params['label']]) - -class RSDataset(Dataset): - def __init__(self, dataframe): - self.dataframe = dataframe - - def __len__(self): - return len(self.dataframe) - - def __getitem__(self, idx): - # Get the nl_mean directly from the dataframe - nl_mean = self.dataframe.iloc[idx]["nl_mean"] - label = self.dataframe.iloc[idx][params['label']] - if pd.isna(label): - return None - - return nl_mean, label - -train_dataset = RSDataset(train_df) -val_dataset = RSDataset(val_df) -test_dataset = RSDataset(test_df) - -train_loader = DataLoader(train_dataset, batch_size=params['batch_size'], shuffle=True) -val_loader = DataLoader(val_dataset, batch_size=params['batch_size'], shuffle=False) -test_loader = DataLoader(test_dataset, batch_size=params['batch_size'], shuffle=False) - - - -class MLP(nn.Module): - def __init__(self, train_dataset, device, num_rbf_anchor_pts, rbf_kernel_size, layers, neurons, act_func): - super(MLP, self).__init__() - self.position_encoder = RBFFeaturePositionEncoder( - train_locs=train_dataset, - num_rbf_anchor_pts=num_rbf_anchor_pts, - rbf_kernel_size=rbf_kernel_size, - device=device - ) - self.model = self._build_model(self.position_encoder.pos_enc_output_dim, layers, neurons, act_func) - - def _build_model(self, input_dim, layers, neurons, act_func): - modules = [] - for i in range(layers): - modules.append(nn.Linear(input_dim if i == 0 else neurons[i-1], neurons[i])) - modules.append(act_func) - modules.append(nn.Linear(neurons[-1], 1)) - return nn.Sequential(*modules) - - def forward(self, x): - loc_embed = self.position_encoder(x) - return self.model(loc_embed) - -nl_mean_list = [] -for batch in train_loader: - nl_mean_batch, _ = batch - nl_mean_list.extend(nl_mean_batch.numpy()) - -### train_nl_mean_array is a tensor of shape (num_train, 1) -train_nl_mean_array = np.array(nl_mean_list).reshape(-1, 1) - -# Define the objective function for Optuna -def objective(trial): - num_rbf_anchor_pts = trial.suggest_int('num_rbf_anchor_pts', 1, 30) - rbf_kernel_size = trial.suggest_int('rbf_kernel_size', 2, 50) - lr = trial.suggest_loguniform('lr', 1e-5, 1e-1) - - # Hyperparameters for the MLP - layers = trial.suggest_int('layers', 1, 3) - neurons = [trial.suggest_int(f'neurons_l{i}', 8, 128) for i in range(layers)] - activation_choices = {'ReLU': nn.ReLU(), 'Tanh': nn.Tanh(), 'LeakyReLU': nn.LeakyReLU()} - activation_name = trial.suggest_categorical('activation', list(activation_choices.keys())) - activation = activation_choices[activation_name] - - model = MLP(train_dataset=train_nl_mean_array, device=params['device'], - num_rbf_anchor_pts=num_rbf_anchor_pts, rbf_kernel_size=rbf_kernel_size, - layers=layers, neurons=neurons, act_func=activation) - model = model.to(params['device']) - - criterion = nn.MSELoss() - optimizer = optim.Adam(model.parameters(), lr=lr) - - num_epochs = 1 # Use fewer epochs for tuning - for epoch in range(num_epochs): - model.train() - train_loss = 0.0 - for batch_idx, (nl_means, labels) in enumerate(train_loader): - nl_means, labels = nl_means.to(params['device']), labels.to(params['device']).float() - optimizer.zero_grad() - nl_means = nl_means.reshape(nl_means.size(0), 1, 1) - outputs = model(nl_means.cpu().numpy()) - loss = criterion(outputs.squeeze(), labels) - loss.backward() - optimizer.step() - train_loss += loss.item() * nl_means.size(0) - - train_loss /= len(train_loader.dataset) - - model.eval() - val_loss = 0.0 - with 
torch.no_grad(): - for nl_means, labels in val_loader: - nl_means, labels = nl_means.to(params['device']), labels.to(params['device']).float() - nl_means = nl_means.reshape(nl_means.size(0), 1, 1) - outputs = model(nl_means.cpu().numpy()) - loss = criterion(outputs.squeeze(), labels) - val_loss += loss.item() * nl_means.size(0) - - val_loss /= len(val_loader.dataset) - - logger.info(f"Trial {trial.number}, Epoch {epoch+1}/{num_epochs}, " - f"Training Loss: {train_loss:.4f}, Validation Loss: {val_loss:.4f}") - - trial.report(val_loss, epoch) - - if trial.should_prune(): - raise optuna.exceptions.TrialPruned() - - # Calculate R² score on the test set - model.eval() - all_preds = [] - all_labels = [] - with torch.no_grad(): - for nl_means, labels in test_loader: - nl_means, labels = nl_means.to(params['device']), labels.to(params['device']).float() - nl_means = nl_means.reshape(nl_means.size(0), 1, 1) - outputs = model(nl_means.cpu().numpy()) - all_preds.append(outputs.cpu().numpy()) - all_labels.append(labels.cpu().numpy()) - - all_preds = np.concatenate(all_preds) - all_labels = np.concatenate(all_labels) - r2 = r2_score(all_labels, all_preds) - - logger.info(f"Trial {trial.number}, Test R²: {r2:.4f}") - - return r2 - -pruner = MedianPruner(n_startup_trials=100, n_warmup_steps=10, interval_steps=5) -study = optuna.create_study(direction='maximize', pruner=pruner) -study.optimize(objective, n_trials=500, timeout=50400) # 50400 seconds = 14 hours - -logger.info("Number of finished trials: %d", len(study.trials)) -logger.info("Best trial:") - -trial = study.best_trial - -logger.info(" Value: %f", trial.value) -logger.info(" Params: ") -for key, value in trial.params.items(): - logger.info(" %s: %s", key, value) diff --git a/pre_process/dhs/utils.py b/pre_process/dhs/utils.py deleted file mode 100644 index 9feca75b..00000000 --- a/pre_process/dhs/utils.py +++ /dev/null @@ -1,172 +0,0 @@ -import os -import numpy as np -import pandas as pd - -import torch -import torch.nn as nn - - -def save_checkpoint(state, is_best, checkpoint_dir, filename='checkpoint_rbf.pth.tar'): - filepath = os.path.join(checkpoint_dir, filename) - torch.save(state, filepath) - if is_best: - best_filepath = os.path.join(checkpoint_dir, 'best_checkpoint_rbf.pth.tar') - torch.save(state, best_filepath) - -def load_checkpoint(model, optimizer, checkpoint_dir, filename='best_checkpoint_rbf.pth.tar'): - filepath = os.path.join(checkpoint_dir, filename) - if os.path.isfile(filepath): - print(f"Loading checkpoint '{filepath}'") - checkpoint = torch.load(filepath) - model.load_state_dict(checkpoint['state_dict']) - optimizer.load_state_dict(checkpoint['optimizer']) - best_val_loss = checkpoint['best_val_loss'] - start_epoch = checkpoint['epoch'] - print(f"Loaded checkpoint '{filepath}' (epoch {start_epoch})") - return model, optimizer, best_val_loss, start_epoch - else: - print(f"No checkpoint found at '{filepath}'") - return model, optimizer, best_val_loss, 0 - - -class RBFFeaturePositionEncoder(nn.Module): - """ - Given a list of values, compute the distance from each point to each RBF anchor point. - Feed into an MLP. - This is for global position encoding or relative/spatial context position encoding. 
- """ - - def __init__( - self, - train_locs, - coord_dim=1, - num_rbf_anchor_pts=100, - rbf_kernel_size=10e2, - rbf_kernel_size_ratio=0.0, - model_type="global", - max_radius=10000, - rbf_anchor_pt_ids=None, - device="cuda", - ): - """ - Args: - train_locs: np.array, [batch_size], location data - num_rbf_anchor_pts: the number of RBF anchor points - rbf_kernel_size: the RBF kernel size - rbf_kernel_size_ratio: if not None, different anchor points have different kernel size - max_radius: the relative spatial context size in spatial context model - """ - super(RBFFeaturePositionEncoder, self).__init__() - self.coord_dim = coord_dim - self.model_type = model_type - self.train_locs = train_locs.values if isinstance(train_locs, pd.Series) else train_locs - self.num_rbf_anchor_pts = num_rbf_anchor_pts - self.rbf_kernel_size = rbf_kernel_size - self.rbf_kernel_size_ratio = rbf_kernel_size_ratio - self.max_radius = max_radius - self.rbf_anchor_pt_ids = rbf_anchor_pt_ids - self.device = device - - # Calculate the coordinate matrix for each RBF anchor point - self.cal_rbf_anchor_coord_mat() - - self.pos_enc_output_dim = self.num_rbf_anchor_pts - # print(f"Position encoding output dimension: {self.pos_enc_output_dim}") - - def _random_sampling(self, item_tuple, num_sample): - """ - Randomly sample a given number of items. - """ - type_list = list(item_tuple) - if len(type_list) > num_sample: - return list(np.random.choice(type_list, num_sample, replace=False)) - elif len(type_list) == num_sample: - return item_tuple - else: - return list(np.random.choice(type_list, num_sample, replace=True)) - - def cal_rbf_anchor_coord_mat(self): - if self.model_type == "global": - assert self.rbf_kernel_size_ratio == 0 - # If we do RBF on location/global model, - # we need to random sample M RBF anchor points from training point dataset - if self.rbf_anchor_pt_ids == None: - self.rbf_anchor_pt_ids = self._random_sampling( - np.arange(len(self.train_locs)), self.num_rbf_anchor_pts - ) - - self.rbf_coords_mat = self.train_locs[self.rbf_anchor_pt_ids] - - elif self.model_type == "relative": - # If we do RBF on spatial context/relative model, - # We just ra ndom sample M-1 RBF anchor point in the relative spatial context defined by max_radius - # The (0,0) is also an anchor point - x_list = np.random.uniform( - -self.max_radius, self.max_radius, self.num_rbf_anchor_pts - ) - x_list[0] = 0.0 - y_list = np.random.uniform( - -self.max_radius, self.max_radius, self.num_rbf_anchor_pts - ) - y_list[0] = 0.0 - # self.rbf_coords: (num_rbf_anchor_pts, 2) - self.rbf_coords_mat = np.transpose(np.stack([x_list, y_list], axis=0)) - - if self.rbf_kernel_size_ratio > 0: - dist_mat = np.sqrt(np.sum(np.power(self.rbf_coords_mat, 2), axis=-1)) - # rbf_kernel_size_mat: (num_rbf_anchor_pts) - self.rbf_kernel_size_mat = ( - dist_mat * self.rbf_kernel_size_ratio + self.rbf_kernel_size - ) - - def make_output_embeds(self, coords): - """ - Given a list of coords (deltaX, deltaY), give their spatial relation embedding - Args: - coords: a python list with shape (batch_size, num_context_pt=1, coord_dim) - Return: - sprenc: Tensor shape (batch_size, num_context_pt, pos_enc_output_dim) - """ - if type(coords) == np.ndarray: - assert self.coord_dim == np.shape(coords)[2] - #print("coords",coords.shape) - coords = list(coords) - elif type(coords) == list: - assert self.coord_dim == len(coords[0][0]) - else: - print("coords type",type(coords)) - raise Exception("Unknown coords data type for RBFSpatialRelationEncoder") - - coords_mat = 
np.asarray(coords).astype(float) - #print("coords_mat1",coords_mat.shape) - batch_size = coords_mat.shape[0] - num_context_pt = coords_mat.shape[1] - - coords_mat = np.repeat(coords_mat, self.num_rbf_anchor_pts, axis=1) - #print("coords_mat2",coords_mat.shape) - coords_mat = coords_mat - self.rbf_coords_mat.T - #print("coords_mat3",coords_mat.shape) - coords_mat = np.sum(np.power(coords_mat, 2), axis=-1) - #print("coords_mat4",coords_mat.shape) - - if self.rbf_kernel_size_ratio > 0: - spr_embeds = np.exp( - (-1 * coords_mat) / (2.0 * np.power(self.rbf_kernel_size_mat, 2)) - ) - else: - spr_embeds = np.exp( - (-1 * coords_mat) / (2.0 * np.power(self.rbf_kernel_size, 2)) - ) - return spr_embeds - - def forward(self, coords): - """ - Given a list of coordinates, compute their spatial relation embedding. - Args: - coords: a list or array with shape (batch_size, num_context_pt=1, coord_dim) - Return: - spr_embeds: Tensor with shape (batch_size, num_context_pt, spa_embed_dim) - """ - spr_embeds = self.make_output_embeds(coords) - spr_embeds = torch.FloatTensor(spr_embeds).to(self.device) - return spr_embeds diff --git a/pre_trained_models/.DS_Store b/pre_trained_models/.DS_Store new file mode 100644 index 00000000..5afbc984 Binary files /dev/null and b/pre_trained_models/.DS_Store differ diff --git a/pre_trained_models/space2vec_grid/model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.log b/pre_trained_models/space2vec_grid/model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.log new file mode 100755 index 00000000..e24dee33 --- /dev/null +++ b/pre_trained_models/space2vec_grid/model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.log @@ -0,0 +1,190 @@ +2024-05-21 03:45:18,884 - INFO - +num_classes 500 +2024-05-21 03:45:18,884 - INFO - num train 42490 +2024-05-21 03:45:18,884 - INFO - num val 980 +2024-05-21 03:45:18,884 - INFO - train loss full_loss +2024-05-21 03:45:18,884 - INFO - model name ../models/space2vec_grid/model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.pth.tar +2024-05-21 03:45:18,884 - INFO - num users 5763 +2024-05-21 03:45:18,884 - INFO - meta data ebird_meta +2024-05-21 03:45:19,692 - INFO - +Epoch 0 +2024-05-21 03:45:52,280 - INFO - [41984/42490] Loss : 1.2121 +2024-05-21 03:45:52,914 - INFO - Test loss : 0.2807 +2024-05-21 03:45:52,914 - INFO - +Epoch 1 +2024-05-21 03:46:24,851 - INFO - [41984/42490] Loss : 0.9268 +2024-05-21 03:46:25,578 - INFO - Test loss : 0.3114 +2024-05-21 03:46:25,578 - INFO - +Epoch 2 +2024-05-21 03:46:57,679 - INFO - [41984/42490] Loss : 0.8627 +2024-05-21 03:46:58,411 - INFO - Test loss : 0.3143 +2024-05-21 03:46:58,411 - INFO - +Epoch 3 +2024-05-21 03:47:31,267 - INFO - [41984/42490] Loss : 0.8333 +2024-05-21 03:47:31,995 - INFO - Test loss : 0.3199 +2024-05-21 03:47:31,995 - INFO - +Epoch 4 +2024-05-21 03:48:04,141 - INFO - [41984/42490] Loss : 0.8134 +2024-05-21 03:48:04,870 - INFO - Test loss : 0.2868 +2024-05-21 03:48:04,870 - INFO - +Epoch 5 +2024-05-21 03:48:37,061 - INFO - [41984/42490] Loss : 0.7981 +2024-05-21 03:48:37,810 - INFO - Test loss : 0.3009 +2024-05-21 03:48:37,810 - INFO - +Epoch 6 +2024-05-21 03:49:10,793 - INFO - [41984/42490] Loss : 0.7865 +2024-05-21 03:49:11,498 - INFO - Test loss : 0.2926 +2024-05-21 03:49:11,498 - INFO - +Epoch 7 +2024-05-21 03:49:43,632 - INFO - [41984/42490] Loss : 0.7753 +2024-05-21 03:49:44,362 - INFO - Test loss : 
0.3062 +2024-05-21 03:49:44,362 - INFO - +Epoch 8 +2024-05-21 03:50:16,502 - INFO - [41984/42490] Loss : 0.7632 +2024-05-21 03:50:17,233 - INFO - Test loss : 0.3451 +2024-05-21 03:50:17,233 - INFO - +Epoch 9 +2024-05-21 03:50:49,422 - INFO - [41984/42490] Loss : 0.7537 +2024-05-21 03:50:50,154 - INFO - Test loss : 0.3492 +2024-05-21 03:50:50,154 - INFO - +Epoch 10 +2024-05-21 03:51:22,344 - INFO - [41984/42490] Loss : 0.7468 +2024-05-21 03:51:23,076 - INFO - Test loss : 0.3722 +2024-05-21 03:51:23,093 - INFO - (980,) +2024-05-21 03:51:23,130 - INFO - Split ID: 0 +2024-05-21 03:51:23,130 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 5.31 +2024-05-21 03:51:23,131 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 12.96 +2024-05-21 03:51:23,131 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 18.27 +2024-05-21 03:51:23,131 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 28.16 +2024-05-21 03:51:23,131 - INFO - +No prior +2024-05-21 03:51:23,132 - INFO - (2262,) +2024-05-21 03:51:23,183 - INFO - Split ID: 0 +2024-05-21 03:51:23,183 - INFO - Top 1 (Epoch 10)acc (%): 70.07 +2024-05-21 03:51:23,183 - INFO - Top 3 (Epoch 10)acc (%): 86.6 +2024-05-21 03:51:23,183 - INFO - Top 5 (Epoch 10)acc (%): 90.05 +2024-05-21 03:51:23,183 - INFO - Top 10 (Epoch 10)acc (%): 92.88 +2024-05-21 03:51:25,412 - INFO - Split ID: 0 +2024-05-21 03:51:25,412 - INFO - Top 1 (Epoch 10)acc (%): 79.44 +2024-05-21 03:51:25,412 - INFO - Top 3 (Epoch 10)acc (%): 91.47 +2024-05-21 03:51:25,412 - INFO - Top 5 (Epoch 10)acc (%): 93.77 +2024-05-21 03:51:25,412 - INFO - Top 10 (Epoch 10)acc (%): 95.76 +2024-05-21 03:51:25,412 - INFO - +Epoch 11 +2024-05-21 03:51:57,281 - INFO - [41984/42490] Loss : 0.7383 +2024-05-21 03:51:58,011 - INFO - Test loss : 0.4009 +2024-05-21 03:51:58,011 - INFO - +Epoch 12 +2024-05-21 03:52:30,194 - INFO - [41984/42490] Loss : 0.7323 +2024-05-21 03:52:30,923 - INFO - Test loss : 0.3713 +2024-05-21 03:52:30,923 - INFO - +Epoch 13 +2024-05-21 03:53:03,036 - INFO - [41984/42490] Loss : 0.7255 +2024-05-21 03:53:03,729 - INFO - Test loss : 0.3680 +2024-05-21 03:53:03,729 - INFO - +Epoch 14 +2024-05-21 03:53:35,947 - INFO - [41984/42490] Loss : 0.7179 +2024-05-21 03:53:36,677 - INFO - Test loss : 0.4003 +2024-05-21 03:53:36,677 - INFO - +Epoch 15 +2024-05-21 03:54:08,846 - INFO - [41984/42490] Loss : 0.7132 +2024-05-21 03:54:09,573 - INFO - Test loss : 0.4290 +2024-05-21 03:54:09,573 - INFO - +Epoch 16 +2024-05-21 03:54:41,666 - INFO - [41984/42490] Loss : 0.7088 +2024-05-21 03:54:42,397 - INFO - Test loss : 0.4313 +2024-05-21 03:54:42,397 - INFO - +Epoch 17 +2024-05-21 03:55:14,476 - INFO - [41984/42490] Loss : 0.7006 +2024-05-21 03:55:15,206 - INFO - Test loss : 0.4718 +2024-05-21 03:55:15,207 - INFO - +Epoch 18 +2024-05-21 03:55:47,439 - INFO - [41984/42490] Loss : 0.6996 +2024-05-21 03:55:48,169 - INFO - Test loss : 0.4784 +2024-05-21 03:55:48,169 - INFO - +Epoch 19 +2024-05-21 03:56:21,015 - INFO - [41984/42490] Loss : 0.6951 +2024-05-21 03:56:21,523 - INFO - Test loss : 0.4522 +2024-05-21 03:56:21,523 - INFO - +Epoch 20 +2024-05-21 03:56:54,638 - INFO - [41984/42490] Loss : 0.6908 +2024-05-21 03:56:55,392 - INFO - Test loss : 0.4740 +2024-05-21 03:56:55,413 - INFO - (980,) +2024-05-21 03:56:55,448 - INFO - Split ID: 0 +2024-05-21 03:56:55,448 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 6.33 +2024-05-21 03:56:55,448 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 13.37 +2024-05-21 03:56:55,448 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 18.98 +2024-05-21 03:56:55,448 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 28.78 +2024-05-21 03:56:55,449 - INFO - 
+No prior +2024-05-21 03:56:55,450 - INFO - (2262,) +2024-05-21 03:56:55,500 - INFO - Split ID: 0 +2024-05-21 03:56:55,500 - INFO - Top 1 (Epoch 20)acc (%): 70.07 +2024-05-21 03:56:55,500 - INFO - Top 3 (Epoch 20)acc (%): 86.6 +2024-05-21 03:56:55,500 - INFO - Top 5 (Epoch 20)acc (%): 90.05 +2024-05-21 03:56:55,500 - INFO - Top 10 (Epoch 20)acc (%): 92.88 +2024-05-21 03:56:57,790 - INFO - Split ID: 0 +2024-05-21 03:56:57,790 - INFO - Top 1 (Epoch 20)acc (%): 79.8 +2024-05-21 03:56:57,791 - INFO - Top 3 (Epoch 20)acc (%): 91.42 +2024-05-21 03:56:57,791 - INFO - Top 5 (Epoch 20)acc (%): 94.08 +2024-05-21 03:56:57,791 - INFO - Top 10 (Epoch 20)acc (%): 96.2 +2024-05-21 03:56:57,791 - INFO - +Epoch 21 +2024-05-21 03:57:29,942 - INFO - [41984/42490] Loss : 0.6868 +2024-05-21 03:57:30,670 - INFO - Test loss : 0.4940 +2024-05-21 03:57:30,670 - INFO - +Epoch 22 +2024-05-21 03:58:02,891 - INFO - [41984/42490] Loss : 0.6812 +2024-05-21 03:58:03,619 - INFO - Test loss : 0.4890 +2024-05-21 03:58:03,620 - INFO - +Epoch 23 +2024-05-21 03:58:35,614 - INFO - [41984/42490] Loss : 0.6791 +2024-05-21 03:58:36,341 - INFO - Test loss : 0.5154 +2024-05-21 03:58:36,342 - INFO - +Epoch 24 +2024-05-21 03:59:08,320 - INFO - [41984/42490] Loss : 0.6759 +2024-05-21 03:59:09,053 - INFO - Test loss : 0.5390 +2024-05-21 03:59:09,053 - INFO - +Epoch 25 +2024-05-21 03:59:41,286 - INFO - [41984/42490] Loss : 0.6698 +2024-05-21 03:59:42,013 - INFO - Test loss : 0.5408 +2024-05-21 03:59:42,013 - INFO - +Epoch 26 +2024-05-21 04:00:14,145 - INFO - [41984/42490] Loss : 0.6696 +2024-05-21 04:00:14,872 - INFO - Test loss : 0.5031 +2024-05-21 04:00:14,872 - INFO - +Epoch 27 +2024-05-21 04:00:46,835 - INFO - [41984/42490] Loss : 0.6662 +2024-05-21 04:00:47,568 - INFO - Test loss : 0.5485 +2024-05-21 04:00:47,568 - INFO - +Epoch 28 +2024-05-21 04:01:19,576 - INFO - [41984/42490] Loss : 0.6637 +2024-05-21 04:01:20,302 - INFO - Test loss : 0.5532 +2024-05-21 04:01:20,303 - INFO - +Epoch 29 +2024-05-21 04:01:52,405 - INFO - [41984/42490] Loss : 0.6584 +2024-05-21 04:01:53,134 - INFO - Test loss : 0.5604 +2024-05-21 04:01:53,134 - INFO - Saving output model to ../models/space2vec_grid/model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.pth.tar +2024-05-21 04:01:53,152 - INFO - Saving output model to ../models/space2vec_grid/model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.pth.tar +2024-05-21 04:01:53,202 - INFO - +No prior +2024-05-21 04:01:53,204 - INFO - (2262,) +2024-05-21 04:01:53,254 - INFO - Split ID: 0 +2024-05-21 04:01:53,255 - INFO - Top 1 acc (%): 70.07 +2024-05-21 04:01:53,255 - INFO - Top 3 acc (%): 86.6 +2024-05-21 04:01:53,255 - INFO - Top 5 acc (%): 90.05 +2024-05-21 04:01:53,255 - INFO - Top 10 acc (%): 92.88 +2024-05-21 04:01:55,483 - INFO - Split ID: 0 +2024-05-21 04:01:55,484 - INFO - Top 1 acc (%): 80.24 +2024-05-21 04:01:55,484 - INFO - Top 3 acc (%): 91.73 +2024-05-21 04:01:55,484 - INFO - Top 5 acc (%): 93.9 +2024-05-21 04:01:55,484 - INFO - Top 10 acc (%): 96.11 +2024-05-21 04:01:55,484 - INFO - +Space2Vec-grid +2024-05-21 04:01:55,484 - INFO - Model : model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.pth.tar +2024-05-21 04:01:55,504 - INFO - (980,) +2024-05-21 04:01:55,539 - INFO - Split ID: 0 +2024-05-21 04:01:55,539 - INFO - Top 1 LocEnc acc (%): 5.92 +2024-05-21 04:01:55,539 - INFO - Top 3 LocEnc acc (%): 13.67 +2024-05-21 04:01:55,539 - INFO - Top 5 LocEnc acc (%): 
18.06 +2024-05-21 04:01:55,539 - INFO - Top 10 LocEnc acc (%): 28.78 diff --git a/pre_trained_models/space2vec_grid/model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.pth.tar b/pre_trained_models/space2vec_grid/model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.pth.tar new file mode 100755 index 00000000..b60279d4 Binary files /dev/null and b/pre_trained_models/space2vec_grid/model_birdsnap_ebird_meta_Space2Vec-grid_inception_v3_0.0100_128_0.1000000_360.000_1_512_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_grid/model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_grid/model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..9c3ee592 --- /dev/null +++ b/pre_trained_models/space2vec_grid/model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log @@ -0,0 +1,190 @@ +2024-05-21 08:52:03,418 - INFO - +num_classes 500 +2024-05-21 08:52:03,418 - INFO - num train 19133 +2024-05-21 08:52:03,418 - INFO - num val 443 +2024-05-21 08:52:03,418 - INFO - train loss full_loss +2024-05-21 08:52:03,418 - INFO - model name ../models/space2vec_grid/model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-21 08:52:03,418 - INFO - num users 2872 +2024-05-21 08:52:03,418 - INFO - meta data orig_meta +2024-05-21 08:52:04,230 - INFO - +Epoch 0 +2024-05-21 08:52:23,601 - INFO - [16384/19133] Loss : 1.8201 +2024-05-21 08:52:24,034 - INFO - Test loss : 0.6548 +2024-05-21 08:52:24,034 - INFO - +Epoch 1 +2024-05-21 08:52:42,702 - INFO - [16384/19133] Loss : 1.3179 +2024-05-21 08:52:43,136 - INFO - Test loss : 0.4822 +2024-05-21 08:52:43,137 - INFO - +Epoch 2 +2024-05-21 08:53:01,914 - INFO - [16384/19133] Loss : 1.1810 +2024-05-21 08:53:02,319 - INFO - Test loss : 0.4086 +2024-05-21 08:53:02,319 - INFO - +Epoch 3 +2024-05-21 08:53:21,097 - INFO - [16384/19133] Loss : 1.1004 +2024-05-21 08:53:21,525 - INFO - Test loss : 0.4055 +2024-05-21 08:53:21,525 - INFO - +Epoch 4 +2024-05-21 08:53:40,309 - INFO - [16384/19133] Loss : 1.0489 +2024-05-21 08:53:40,744 - INFO - Test loss : 0.4131 +2024-05-21 08:53:40,744 - INFO - +Epoch 5 +2024-05-21 08:53:59,523 - INFO - [16384/19133] Loss : 1.0036 +2024-05-21 08:53:59,958 - INFO - Test loss : 0.4619 +2024-05-21 08:53:59,958 - INFO - +Epoch 6 +2024-05-21 08:54:18,698 - INFO - [16384/19133] Loss : 0.9676 +2024-05-21 08:54:19,136 - INFO - Test loss : 0.4426 +2024-05-21 08:54:19,136 - INFO - +Epoch 7 +2024-05-21 08:54:37,889 - INFO - [16384/19133] Loss : 0.9351 +2024-05-21 08:54:38,324 - INFO - Test loss : 0.5045 +2024-05-21 08:54:38,324 - INFO - +Epoch 8 +2024-05-21 08:54:57,104 - INFO - [16384/19133] Loss : 0.9131 +2024-05-21 08:54:57,539 - INFO - Test loss : 0.4663 +2024-05-21 08:54:57,539 - INFO - +Epoch 9 +2024-05-21 08:55:16,180 - INFO - [16384/19133] Loss : 0.8858 +2024-05-21 08:55:16,615 - INFO - Test loss : 0.4803 +2024-05-21 08:55:16,616 - INFO - +Epoch 10 +2024-05-21 08:55:35,391 - INFO - [16384/19133] Loss : 0.8711 +2024-05-21 08:55:35,826 - INFO - Test loss : 0.5086 +2024-05-21 08:55:35,840 - INFO - (443,) +2024-05-21 08:55:35,857 - INFO - Split ID: 0 +2024-05-21 08:55:35,857 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 4.97 +2024-05-21 08:55:35,857 - INFO - Top 3 
LocEnc (Epoch 10)acc (%): 9.93 +2024-05-21 08:55:35,858 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 13.54 +2024-05-21 08:55:35,858 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 20.99 +2024-05-21 08:55:35,858 - INFO - +No prior +2024-05-21 08:55:35,859 - INFO - (2262,) +2024-05-21 08:55:35,911 - INFO - Split ID: 0 +2024-05-21 08:55:35,911 - INFO - Top 1 (Epoch 10)acc (%): 70.07 +2024-05-21 08:55:35,911 - INFO - Top 3 (Epoch 10)acc (%): 86.6 +2024-05-21 08:55:35,911 - INFO - Top 5 (Epoch 10)acc (%): 90.05 +2024-05-21 08:55:35,911 - INFO - Top 10 (Epoch 10)acc (%): 92.88 +2024-05-21 08:55:37,434 - INFO - Split ID: 0 +2024-05-21 08:55:37,435 - INFO - Top 1 (Epoch 10)acc (%): 71.79 +2024-05-21 08:55:37,435 - INFO - Top 3 (Epoch 10)acc (%): 86.83 +2024-05-21 08:55:37,435 - INFO - Top 5 (Epoch 10)acc (%): 90.14 +2024-05-21 08:55:37,435 - INFO - Top 10 (Epoch 10)acc (%): 93.28 +2024-05-21 08:55:37,435 - INFO - +Epoch 11 +2024-05-21 08:55:56,211 - INFO - [16384/19133] Loss : 0.8557 +2024-05-21 08:55:56,640 - INFO - Test loss : 0.5156 +2024-05-21 08:55:56,640 - INFO - +Epoch 12 +2024-05-21 08:56:15,297 - INFO - [16384/19133] Loss : 0.8395 +2024-05-21 08:56:15,731 - INFO - Test loss : 0.5095 +2024-05-21 08:56:15,731 - INFO - +Epoch 13 +2024-05-21 08:56:34,873 - INFO - [16384/19133] Loss : 0.8246 +2024-05-21 08:56:35,311 - INFO - Test loss : 0.5387 +2024-05-21 08:56:35,311 - INFO - +Epoch 14 +2024-05-21 08:56:54,109 - INFO - [16384/19133] Loss : 0.8123 +2024-05-21 08:56:54,544 - INFO - Test loss : 0.5443 +2024-05-21 08:56:54,544 - INFO - +Epoch 15 +2024-05-21 08:57:13,340 - INFO - [16384/19133] Loss : 0.7998 +2024-05-21 08:57:13,774 - INFO - Test loss : 0.5673 +2024-05-21 08:57:13,774 - INFO - +Epoch 16 +2024-05-21 08:57:32,486 - INFO - [16384/19133] Loss : 0.7858 +2024-05-21 08:57:32,896 - INFO - Test loss : 0.6042 +2024-05-21 08:57:32,896 - INFO - +Epoch 17 +2024-05-21 08:57:51,668 - INFO - [16384/19133] Loss : 0.7756 +2024-05-21 08:57:52,101 - INFO - Test loss : 0.6009 +2024-05-21 08:57:52,102 - INFO - +Epoch 18 +2024-05-21 08:58:10,864 - INFO - [16384/19133] Loss : 0.7695 +2024-05-21 08:58:11,302 - INFO - Test loss : 0.5981 +2024-05-21 08:58:11,302 - INFO - +Epoch 19 +2024-05-21 08:58:30,088 - INFO - [16384/19133] Loss : 0.7596 +2024-05-21 08:58:30,498 - INFO - Test loss : 0.6400 +2024-05-21 08:58:30,498 - INFO - +Epoch 20 +2024-05-21 08:58:49,283 - INFO - [16384/19133] Loss : 0.7498 +2024-05-21 08:58:49,718 - INFO - Test loss : 0.6468 +2024-05-21 08:58:49,725 - INFO - (443,) +2024-05-21 08:58:49,742 - INFO - Split ID: 0 +2024-05-21 08:58:49,742 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 4.74 +2024-05-21 08:58:49,742 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 9.93 +2024-05-21 08:58:49,743 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 15.58 +2024-05-21 08:58:49,743 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 23.48 +2024-05-21 08:58:49,743 - INFO - +No prior +2024-05-21 08:58:49,744 - INFO - (2262,) +2024-05-21 08:58:49,796 - INFO - Split ID: 0 +2024-05-21 08:58:49,796 - INFO - Top 1 (Epoch 20)acc (%): 70.07 +2024-05-21 08:58:49,796 - INFO - Top 3 (Epoch 20)acc (%): 86.6 +2024-05-21 08:58:49,796 - INFO - Top 5 (Epoch 20)acc (%): 90.05 +2024-05-21 08:58:49,796 - INFO - Top 10 (Epoch 20)acc (%): 92.88 +2024-05-21 08:58:51,305 - INFO - Split ID: 0 +2024-05-21 08:58:51,305 - INFO - Top 1 (Epoch 20)acc (%): 71.79 +2024-05-21 08:58:51,307 - INFO - Top 3 (Epoch 20)acc (%): 86.65 +2024-05-21 08:58:51,307 - INFO - Top 5 (Epoch 20)acc (%): 90.1 +2024-05-21 08:58:51,307 - INFO - Top 10 (Epoch 20)acc (%): 93.06 +2024-05-21 
08:58:51,307 - INFO - +Epoch 21 +2024-05-21 08:59:10,080 - INFO - [16384/19133] Loss : 0.7448 +2024-05-21 08:59:10,521 - INFO - Test loss : 0.6527 +2024-05-21 08:59:10,521 - INFO - +Epoch 22 +2024-05-21 08:59:29,275 - INFO - [16384/19133] Loss : 0.7395 +2024-05-21 08:59:29,710 - INFO - Test loss : 0.6764 +2024-05-21 08:59:29,711 - INFO - +Epoch 23 +2024-05-21 08:59:48,476 - INFO - [16384/19133] Loss : 0.7266 +2024-05-21 08:59:48,911 - INFO - Test loss : 0.6794 +2024-05-21 08:59:48,911 - INFO - +Epoch 24 +2024-05-21 09:00:07,583 - INFO - [16384/19133] Loss : 0.7181 +2024-05-21 09:00:08,018 - INFO - Test loss : 0.7127 +2024-05-21 09:00:08,018 - INFO - +Epoch 25 +2024-05-21 09:00:26,554 - INFO - [16384/19133] Loss : 0.7194 +2024-05-21 09:00:26,993 - INFO - Test loss : 0.7361 +2024-05-21 09:00:26,993 - INFO - +Epoch 26 +2024-05-21 09:00:46,047 - INFO - [16384/19133] Loss : 0.7046 +2024-05-21 09:00:46,486 - INFO - Test loss : 0.7101 +2024-05-21 09:00:46,486 - INFO - +Epoch 27 +2024-05-21 09:01:05,444 - INFO - [16384/19133] Loss : 0.7028 +2024-05-21 09:01:05,879 - INFO - Test loss : 0.7409 +2024-05-21 09:01:05,879 - INFO - +Epoch 28 +2024-05-21 09:01:24,609 - INFO - [16384/19133] Loss : 0.7032 +2024-05-21 09:01:25,041 - INFO - Test loss : 0.7146 +2024-05-21 09:01:25,041 - INFO - +Epoch 29 +2024-05-21 09:01:43,824 - INFO - [16384/19133] Loss : 0.6919 +2024-05-21 09:01:44,259 - INFO - Test loss : 0.7455 +2024-05-21 09:01:44,259 - INFO - Saving output model to ../models/space2vec_grid/model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-21 09:01:44,270 - INFO - Saving output model to ../models/space2vec_grid/model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-21 09:01:44,294 - INFO - +No prior +2024-05-21 09:01:44,296 - INFO - (2262,) +2024-05-21 09:01:44,347 - INFO - Split ID: 0 +2024-05-21 09:01:44,348 - INFO - Top 1 acc (%): 70.07 +2024-05-21 09:01:44,348 - INFO - Top 3 acc (%): 86.6 +2024-05-21 09:01:44,348 - INFO - Top 5 acc (%): 90.05 +2024-05-21 09:01:44,348 - INFO - Top 10 acc (%): 92.88 +2024-05-21 09:01:45,869 - INFO - Split ID: 0 +2024-05-21 09:01:45,869 - INFO - Top 1 acc (%): 71.75 +2024-05-21 09:01:45,869 - INFO - Top 3 acc (%): 86.65 +2024-05-21 09:01:45,869 - INFO - Top 5 acc (%): 90.01 +2024-05-21 09:01:45,869 - INFO - Top 10 acc (%): 93.1 +2024-05-21 09:01:45,869 - INFO - +Space2Vec-grid +2024-05-21 09:01:45,869 - INFO - Model : model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-21 09:01:45,886 - INFO - (443,) +2024-05-21 09:01:45,902 - INFO - Split ID: 0 +2024-05-21 09:01:45,902 - INFO - Top 1 LocEnc acc (%): 5.42 +2024-05-21 09:01:45,902 - INFO - Top 3 LocEnc acc (%): 11.96 +2024-05-21 09:01:45,902 - INFO - Top 5 LocEnc acc (%): 16.7 +2024-05-21 09:01:45,902 - INFO - Top 10 LocEnc acc (%): 26.19 diff --git a/pre_trained_models/space2vec_grid/model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_grid/model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..6d6ebacf Binary files /dev/null and b/pre_trained_models/space2vec_grid/model_birdsnap_orig_meta_Space2Vec-grid_inception_v3_0.0100_128_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar differ diff --git 
a/pre_trained_models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.log b/pre_trained_models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.log new file mode 100755 index 00000000..c074e68f --- /dev/null +++ b/pre_trained_models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.log @@ -0,0 +1,1357 @@ +2024-05-21 15:18:57,260 - INFO - +num_classes 62 +2024-05-21 15:18:57,260 - INFO - num train 363571 +2024-05-21 15:18:57,260 - INFO - num val 53041 +2024-05-21 15:18:57,260 - INFO - train loss full_loss +2024-05-21 15:18:57,260 - INFO - model name ../models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-21 15:18:57,260 - INFO - num users 1 +2024-05-21 15:18:58,119 - INFO - +Epoch 0 +2024-05-21 15:19:03,277 - INFO - [0/363571] Loss : 2.2591 +2024-05-21 15:19:41,584 - INFO - Test loss : 1.0955 +2024-05-21 15:19:41,585 - INFO - +Epoch 1 +2024-05-21 15:19:46,261 - INFO - [0/363571] Loss : 1.8658 +2024-05-21 15:20:25,076 - INFO - Test loss : 0.5256 +2024-05-21 15:20:25,076 - INFO - +Epoch 2 +2024-05-21 15:20:29,724 - INFO - [0/363571] Loss : 1.7806 +2024-05-21 15:21:08,644 - INFO - Test loss : 2.6064 +2024-05-21 15:21:08,644 - INFO - +Epoch 3 +2024-05-21 15:21:13,209 - INFO - [0/363571] Loss : 2.4495 +2024-05-21 15:21:53,083 - INFO - Test loss : 0.5634 +2024-05-21 15:21:53,083 - INFO - +Epoch 4 +2024-05-21 15:21:57,691 - INFO - [0/363571] Loss : 1.6259 +2024-05-21 15:22:36,528 - INFO - Test loss : 0.2942 +2024-05-21 15:22:36,528 - INFO - +Epoch 5 +2024-05-21 15:22:41,173 - INFO - [0/363571] Loss : 1.8798 +2024-05-21 15:23:20,072 - INFO - Test loss : 0.3882 +2024-05-21 15:23:20,073 - INFO - +Epoch 6 +2024-05-21 15:23:24,735 - INFO - [0/363571] Loss : 1.7174 +2024-05-21 15:24:02,238 - INFO - Test loss : 0.6655 +2024-05-21 15:24:02,238 - INFO - +Epoch 7 +2024-05-21 15:24:05,329 - INFO - [0/363571] Loss : 1.5393 +2024-05-21 15:24:44,404 - INFO - Test loss : 1.0523 +2024-05-21 15:24:44,404 - INFO - +Epoch 8 +2024-05-21 15:24:49,180 - INFO - [0/363571] Loss : 1.5629 +2024-05-21 15:25:28,509 - INFO - Test loss : 1.1283 +2024-05-21 15:25:28,509 - INFO - +Epoch 9 +2024-05-21 15:25:33,201 - INFO - [0/363571] Loss : 1.5586 +2024-05-21 15:26:12,471 - INFO - Test loss : 0.8804 +2024-05-21 15:26:12,471 - INFO - +Epoch 10 +2024-05-21 15:26:17,247 - INFO - [0/363571] Loss : 1.4915 +2024-05-21 15:26:56,579 - INFO - Test loss : 0.6378 +2024-05-21 15:26:57,214 - INFO - (53041,) +2024-05-21 15:26:57,368 - INFO - Split ID: 0 +2024-05-21 15:26:57,387 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 17.98 +2024-05-21 15:26:57,389 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 29.06 +2024-05-21 15:26:57,391 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 36.46 +2024-05-21 15:26:57,393 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 49.08 +2024-05-21 15:26:57,399 - INFO - +No prior +2024-05-21 15:26:57,404 - INFO - (53041,) +2024-05-21 15:26:57,558 - INFO - Split ID: 0 +2024-05-21 15:26:57,559 - INFO - Top 1 (Epoch 10)acc (%): 69.83 +2024-05-21 15:26:57,559 - INFO - Top 3 (Epoch 10)acc (%): 84.61 +2024-05-21 15:26:57,559 - INFO - Top 5 (Epoch 10)acc (%): 89.23 +2024-05-21 15:26:57,559 - INFO - Top 10 (Epoch 10)acc (%): 94.29 +2024-05-21 15:27:37,730 - INFO - Split ID: 0 +2024-05-21 15:27:37,730 - INFO - Top 1 (Epoch 10)acc (%): 70.27 +2024-05-21 15:27:37,730 - INFO - Top 3 (Epoch 10)acc (%): 
85.0 +2024-05-21 15:27:37,731 - INFO - Top 5 (Epoch 10)acc (%): 89.6 +2024-05-21 15:27:37,731 - INFO - Top 10 (Epoch 10)acc (%): 94.51 +2024-05-21 15:27:37,731 - INFO - +Epoch 11 +2024-05-21 15:27:42,389 - INFO - [0/363571] Loss : 1.4817 +2024-05-21 15:28:21,723 - INFO - Test loss : 0.5646 +2024-05-21 15:28:21,723 - INFO - +Epoch 12 +2024-05-21 15:28:26,405 - INFO - [0/363571] Loss : 1.4819 +2024-05-21 15:29:05,795 - INFO - Test loss : 0.6300 +2024-05-21 15:29:05,795 - INFO - +Epoch 13 +2024-05-21 15:29:10,430 - INFO - [0/363571] Loss : 1.4468 +2024-05-21 15:29:49,726 - INFO - Test loss : 0.7556 +2024-05-21 15:29:49,726 - INFO - +Epoch 14 +2024-05-21 15:29:54,519 - INFO - [0/363571] Loss : 1.4109 +2024-05-21 15:30:34,528 - INFO - Test loss : 0.8544 +2024-05-21 15:30:34,529 - INFO - +Epoch 15 +2024-05-21 15:30:39,280 - INFO - [0/363571] Loss : 1.4009 +2024-05-21 15:31:18,537 - INFO - Test loss : 0.8636 +2024-05-21 15:31:18,537 - INFO - +Epoch 16 +2024-05-21 15:31:23,176 - INFO - [0/363571] Loss : 1.4028 +2024-05-21 15:32:02,345 - INFO - Test loss : 0.7883 +2024-05-21 15:32:02,345 - INFO - +Epoch 17 +2024-05-21 15:32:07,067 - INFO - [0/363571] Loss : 1.3725 +2024-05-21 15:32:46,360 - INFO - Test loss : 0.6961 +2024-05-21 15:32:46,360 - INFO - +Epoch 18 +2024-05-21 15:32:51,041 - INFO - [0/363571] Loss : 1.3628 +2024-05-21 15:33:30,317 - INFO - Test loss : 0.6309 +2024-05-21 15:33:30,317 - INFO - +Epoch 19 +2024-05-21 15:33:34,992 - INFO - [0/363571] Loss : 1.3497 +2024-05-21 15:34:14,421 - INFO - Test loss : 0.6064 +2024-05-21 15:34:14,421 - INFO - +Epoch 20 +2024-05-21 15:34:19,051 - INFO - [0/363571] Loss : 1.3543 +2024-05-21 15:34:58,449 - INFO - Test loss : 0.6167 +2024-05-21 15:34:59,069 - INFO - (53041,) +2024-05-21 15:34:59,223 - INFO - Split ID: 0 +2024-05-21 15:34:59,245 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 17.12 +2024-05-21 15:34:59,247 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 30.5 +2024-05-21 15:34:59,248 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 37.86 +2024-05-21 15:34:59,250 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 51.84 +2024-05-21 15:34:59,256 - INFO - +No prior +2024-05-21 15:34:59,261 - INFO - (53041,) +2024-05-21 15:34:59,414 - INFO - Split ID: 0 +2024-05-21 15:34:59,415 - INFO - Top 1 (Epoch 20)acc (%): 69.83 +2024-05-21 15:34:59,415 - INFO - Top 3 (Epoch 20)acc (%): 84.61 +2024-05-21 15:34:59,415 - INFO - Top 5 (Epoch 20)acc (%): 89.23 +2024-05-21 15:34:59,415 - INFO - Top 10 (Epoch 20)acc (%): 94.29 +2024-05-21 15:35:39,610 - INFO - Split ID: 0 +2024-05-21 15:35:39,611 - INFO - Top 1 (Epoch 20)acc (%): 70.36 +2024-05-21 15:35:39,611 - INFO - Top 3 (Epoch 20)acc (%): 85.07 +2024-05-21 15:35:39,611 - INFO - Top 5 (Epoch 20)acc (%): 89.66 +2024-05-21 15:35:39,611 - INFO - Top 10 (Epoch 20)acc (%): 94.54 +2024-05-21 15:35:39,611 - INFO - +Epoch 21 +2024-05-21 15:35:44,310 - INFO - [0/363571] Loss : 1.3399 +2024-05-21 15:36:23,662 - INFO - Test loss : 0.6500 +2024-05-21 15:36:23,662 - INFO - +Epoch 22 +2024-05-21 15:36:28,391 - INFO - [0/363571] Loss : 1.3226 +2024-05-21 15:37:07,687 - INFO - Test loss : 0.6921 +2024-05-21 15:37:07,687 - INFO - +Epoch 23 +2024-05-21 15:37:12,356 - INFO - [0/363571] Loss : 1.3164 +2024-05-21 15:37:51,596 - INFO - Test loss : 0.7218 +2024-05-21 15:37:51,596 - INFO - +Epoch 24 +2024-05-21 15:37:56,351 - INFO - [0/363571] Loss : 1.3133 +2024-05-21 15:38:35,593 - INFO - Test loss : 0.7230 +2024-05-21 15:38:35,594 - INFO - +Epoch 25 +2024-05-21 15:38:40,299 - INFO - [0/363571] Loss : 1.3100 +2024-05-21 15:39:20,446 - INFO - Test loss : 0.6973 
+2024-05-21 15:39:20,447 - INFO - +Epoch 26 +2024-05-21 15:39:24,690 - INFO - [0/363571] Loss : 1.3078 +2024-05-21 15:40:04,007 - INFO - Test loss : 0.6627 +2024-05-21 15:40:04,008 - INFO - +Epoch 27 +2024-05-21 15:40:08,693 - INFO - [0/363571] Loss : 1.3030 +2024-05-21 15:40:44,676 - INFO - Test loss : 0.6344 +2024-05-21 15:40:44,676 - INFO - +Epoch 28 +2024-05-21 15:40:49,332 - INFO - [0/363571] Loss : 1.2763 +2024-05-21 15:41:28,516 - INFO - Test loss : 0.6209 +2024-05-21 15:41:28,516 - INFO - +Epoch 29 +2024-05-21 15:41:33,199 - INFO - [0/363571] Loss : 1.2909 +2024-05-21 15:42:12,264 - INFO - Test loss : 0.6215 +2024-05-21 15:42:12,264 - INFO - +Epoch 30 +2024-05-21 15:42:16,892 - INFO - [0/363571] Loss : 1.2700 +2024-05-21 15:42:56,001 - INFO - Test loss : 0.6338 +2024-05-21 15:42:56,546 - INFO - (53041,) +2024-05-21 15:42:56,699 - INFO - Split ID: 0 +2024-05-21 15:42:56,718 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 17.61 +2024-05-21 15:42:56,719 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 30.42 +2024-05-21 15:42:56,721 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 38.64 +2024-05-21 15:42:56,723 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 52.01 +2024-05-21 15:42:56,729 - INFO - +No prior +2024-05-21 15:42:56,734 - INFO - (53041,) +2024-05-21 15:42:56,886 - INFO - Split ID: 0 +2024-05-21 15:42:56,886 - INFO - Top 1 (Epoch 30)acc (%): 69.83 +2024-05-21 15:42:56,886 - INFO - Top 3 (Epoch 30)acc (%): 84.61 +2024-05-21 15:42:56,887 - INFO - Top 5 (Epoch 30)acc (%): 89.23 +2024-05-21 15:42:56,887 - INFO - Top 10 (Epoch 30)acc (%): 94.29 +2024-05-21 15:43:37,039 - INFO - Split ID: 0 +2024-05-21 15:43:37,040 - INFO - Top 1 (Epoch 30)acc (%): 70.48 +2024-05-21 15:43:37,040 - INFO - Top 3 (Epoch 30)acc (%): 85.2 +2024-05-21 15:43:37,040 - INFO - Top 5 (Epoch 30)acc (%): 89.76 +2024-05-21 15:43:37,040 - INFO - Top 10 (Epoch 30)acc (%): 94.61 +2024-05-21 15:43:37,040 - INFO - +Epoch 31 +2024-05-21 15:43:41,786 - INFO - [0/363571] Loss : 1.2742 +2024-05-21 15:44:20,894 - INFO - Test loss : 0.6509 +2024-05-21 15:44:20,895 - INFO - +Epoch 32 +2024-05-21 15:44:25,515 - INFO - [0/363571] Loss : 1.2835 +2024-05-21 15:45:04,648 - INFO - Test loss : 0.6662 +2024-05-21 15:45:04,648 - INFO - +Epoch 33 +2024-05-21 15:45:09,395 - INFO - [0/363571] Loss : 1.2579 +2024-05-21 15:45:48,505 - INFO - Test loss : 0.6717 +2024-05-21 15:45:48,505 - INFO - +Epoch 34 +2024-05-21 15:45:53,162 - INFO - [0/363571] Loss : 1.2605 +2024-05-21 15:46:32,273 - INFO - Test loss : 0.6630 +2024-05-21 15:46:32,273 - INFO - +Epoch 35 +2024-05-21 15:46:36,914 - INFO - [0/363571] Loss : 1.2513 +2024-05-21 15:47:16,258 - INFO - Test loss : 0.6485 +2024-05-21 15:47:16,258 - INFO - +Epoch 36 +2024-05-21 15:47:21,266 - INFO - [0/363571] Loss : 1.2442 +2024-05-21 15:48:00,808 - INFO - Test loss : 0.6320 +2024-05-21 15:48:00,808 - INFO - +Epoch 37 +2024-05-21 15:48:05,489 - INFO - [0/363571] Loss : 1.2475 +2024-05-21 15:48:44,444 - INFO - Test loss : 0.6186 +2024-05-21 15:48:44,444 - INFO - +Epoch 38 +2024-05-21 15:48:49,177 - INFO - [0/363571] Loss : 1.2388 +2024-05-21 15:49:28,241 - INFO - Test loss : 0.6134 +2024-05-21 15:49:28,241 - INFO - +Epoch 39 +2024-05-21 15:49:32,829 - INFO - [0/363571] Loss : 1.2341 +2024-05-21 15:50:11,928 - INFO - Test loss : 0.6158 +2024-05-21 15:50:11,928 - INFO - +Epoch 40 +2024-05-21 15:50:16,662 - INFO - [0/363571] Loss : 1.2379 +2024-05-21 15:50:55,750 - INFO - Test loss : 0.6238 +2024-05-21 15:50:56,344 - INFO - (53041,) +2024-05-21 15:50:56,494 - INFO - Split ID: 0 +2024-05-21 15:50:56,513 - INFO - Top 1 LocEnc 
(Epoch 40)acc (%): 18.42 +2024-05-21 15:50:56,515 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 31.72 +2024-05-21 15:50:56,516 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 39.63 +2024-05-21 15:50:56,518 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 53.61 +2024-05-21 15:50:56,524 - INFO - +No prior +2024-05-21 15:50:56,529 - INFO - (53041,) +2024-05-21 15:50:56,685 - INFO - Split ID: 0 +2024-05-21 15:50:56,685 - INFO - Top 1 (Epoch 40)acc (%): 69.83 +2024-05-21 15:50:56,685 - INFO - Top 3 (Epoch 40)acc (%): 84.61 +2024-05-21 15:50:56,685 - INFO - Top 5 (Epoch 40)acc (%): 89.23 +2024-05-21 15:50:56,685 - INFO - Top 10 (Epoch 40)acc (%): 94.29 +2024-05-21 15:51:36,843 - INFO - Split ID: 0 +2024-05-21 15:51:36,843 - INFO - Top 1 (Epoch 40)acc (%): 70.51 +2024-05-21 15:51:36,844 - INFO - Top 3 (Epoch 40)acc (%): 85.2 +2024-05-21 15:51:36,844 - INFO - Top 5 (Epoch 40)acc (%): 89.8 +2024-05-21 15:51:36,844 - INFO - Top 10 (Epoch 40)acc (%): 94.65 +2024-05-21 15:51:36,844 - INFO - +Epoch 41 +2024-05-21 15:51:41,517 - INFO - [0/363571] Loss : 1.2438 +2024-05-21 15:52:20,605 - INFO - Test loss : 0.6318 +2024-05-21 15:52:20,605 - INFO - +Epoch 42 +2024-05-21 15:52:25,382 - INFO - [0/363571] Loss : 1.2340 +2024-05-21 15:53:04,574 - INFO - Test loss : 0.6372 +2024-05-21 15:53:04,574 - INFO - +Epoch 43 +2024-05-21 15:53:09,210 - INFO - [0/363571] Loss : 1.2271 +2024-05-21 15:53:48,468 - INFO - Test loss : 0.6377 +2024-05-21 15:53:48,468 - INFO - +Epoch 44 +2024-05-21 15:53:53,110 - INFO - [0/363571] Loss : 1.2239 +2024-05-21 15:54:32,598 - INFO - Test loss : 0.6342 +2024-05-21 15:54:32,598 - INFO - +Epoch 45 +2024-05-21 15:54:37,319 - INFO - [0/363571] Loss : 1.2268 +2024-05-21 15:55:16,583 - INFO - Test loss : 0.6263 +2024-05-21 15:55:16,583 - INFO - +Epoch 46 +2024-05-21 15:55:21,223 - INFO - [0/363571] Loss : 1.2250 +2024-05-21 15:56:01,266 - INFO - Test loss : 0.6164 +2024-05-21 15:56:01,266 - INFO - +Epoch 47 +2024-05-21 15:56:05,987 - INFO - [0/363571] Loss : 1.2202 +2024-05-21 15:56:45,053 - INFO - Test loss : 0.6077 +2024-05-21 15:56:45,054 - INFO - +Epoch 48 +2024-05-21 15:56:48,978 - INFO - [0/363571] Loss : 1.2196 +2024-05-21 15:57:26,031 - INFO - Test loss : 0.6026 +2024-05-21 15:57:26,031 - INFO - +Epoch 49 +2024-05-21 15:57:30,767 - INFO - [0/363571] Loss : 1.2163 +2024-05-21 15:58:09,622 - INFO - Test loss : 0.6014 +2024-05-21 15:58:09,622 - INFO - +Epoch 50 +2024-05-21 15:58:14,180 - INFO - [0/363571] Loss : 1.2009 +2024-05-21 15:58:53,103 - INFO - Test loss : 0.6062 +2024-05-21 15:58:53,710 - INFO - (53041,) +2024-05-21 15:58:53,859 - INFO - Split ID: 0 +2024-05-21 15:58:53,878 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 18.66 +2024-05-21 15:58:53,880 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 32.7 +2024-05-21 15:58:53,881 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 40.56 +2024-05-21 15:58:53,883 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 55.26 +2024-05-21 15:58:53,889 - INFO - +No prior +2024-05-21 15:58:53,894 - INFO - (53041,) +2024-05-21 15:58:54,048 - INFO - Split ID: 0 +2024-05-21 15:58:54,049 - INFO - Top 1 (Epoch 50)acc (%): 69.83 +2024-05-21 15:58:54,049 - INFO - Top 3 (Epoch 50)acc (%): 84.61 +2024-05-21 15:58:54,049 - INFO - Top 5 (Epoch 50)acc (%): 89.23 +2024-05-21 15:58:54,049 - INFO - Top 10 (Epoch 50)acc (%): 94.29 +2024-05-21 15:59:34,164 - INFO - Split ID: 0 +2024-05-21 15:59:34,183 - INFO - Top 1 (Epoch 50)acc (%): 70.52 +2024-05-21 15:59:34,183 - INFO - Top 3 (Epoch 50)acc (%): 85.26 +2024-05-21 15:59:34,184 - INFO - Top 5 (Epoch 50)acc (%): 89.83 +2024-05-21 15:59:34,184 - INFO - Top 10 
(Epoch 50)acc (%): 94.7 +2024-05-21 15:59:34,184 - INFO - +Epoch 51 +2024-05-21 15:59:38,814 - INFO - [0/363571] Loss : 1.2043 +2024-05-21 16:00:17,691 - INFO - Test loss : 0.6145 +2024-05-21 16:00:17,691 - INFO - +Epoch 52 +2024-05-21 16:00:22,356 - INFO - [0/363571] Loss : 1.2123 +2024-05-21 16:01:01,262 - INFO - Test loss : 0.6217 +2024-05-21 16:01:01,263 - INFO - +Epoch 53 +2024-05-21 16:01:05,879 - INFO - [0/363571] Loss : 1.1984 +2024-05-21 16:01:44,735 - INFO - Test loss : 0.6267 +2024-05-21 16:01:44,736 - INFO - +Epoch 54 +2024-05-21 16:01:49,469 - INFO - [0/363571] Loss : 1.2070 +2024-05-21 16:02:28,428 - INFO - Test loss : 0.6272 +2024-05-21 16:02:28,428 - INFO - +Epoch 55 +2024-05-21 16:02:33,023 - INFO - [0/363571] Loss : 1.1932 +2024-05-21 16:03:11,854 - INFO - Test loss : 0.6253 +2024-05-21 16:03:11,855 - INFO - +Epoch 56 +2024-05-21 16:03:16,602 - INFO - [0/363571] Loss : 1.1933 +2024-05-21 16:03:53,619 - INFO - Test loss : 0.6197 +2024-05-21 16:03:53,619 - INFO - +Epoch 57 +2024-05-21 16:03:58,186 - INFO - [0/363571] Loss : 1.2033 +2024-05-21 16:04:37,009 - INFO - Test loss : 0.6113 +2024-05-21 16:04:37,009 - INFO - +Epoch 58 +2024-05-21 16:04:41,729 - INFO - [0/363571] Loss : 1.1928 +2024-05-21 16:05:20,579 - INFO - Test loss : 0.6033 +2024-05-21 16:05:20,579 - INFO - +Epoch 59 +2024-05-21 16:05:25,216 - INFO - [0/363571] Loss : 1.1772 +2024-05-21 16:06:04,167 - INFO - Test loss : 0.5981 +2024-05-21 16:06:04,168 - INFO - +Epoch 60 +2024-05-21 16:06:08,754 - INFO - [0/363571] Loss : 1.1827 +2024-05-21 16:06:47,748 - INFO - Test loss : 0.5971 +2024-05-21 16:06:48,374 - INFO - (53041,) +2024-05-21 16:06:48,528 - INFO - Split ID: 0 +2024-05-21 16:06:48,547 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 19.63 +2024-05-21 16:06:48,549 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 32.65 +2024-05-21 16:06:48,551 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 41.08 +2024-05-21 16:06:48,553 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 55.27 +2024-05-21 16:06:48,559 - INFO - +No prior +2024-05-21 16:06:48,564 - INFO - (53041,) +2024-05-21 16:06:48,716 - INFO - Split ID: 0 +2024-05-21 16:06:48,716 - INFO - Top 1 (Epoch 60)acc (%): 69.83 +2024-05-21 16:06:48,717 - INFO - Top 3 (Epoch 60)acc (%): 84.61 +2024-05-21 16:06:48,717 - INFO - Top 5 (Epoch 60)acc (%): 89.23 +2024-05-21 16:06:48,717 - INFO - Top 10 (Epoch 60)acc (%): 94.29 +2024-05-21 16:07:28,821 - INFO - Split ID: 0 +2024-05-21 16:07:28,821 - INFO - Top 1 (Epoch 60)acc (%): 70.57 +2024-05-21 16:07:28,821 - INFO - Top 3 (Epoch 60)acc (%): 85.28 +2024-05-21 16:07:28,821 - INFO - Top 5 (Epoch 60)acc (%): 89.85 +2024-05-21 16:07:28,821 - INFO - Top 10 (Epoch 60)acc (%): 94.69 +2024-05-21 16:07:28,822 - INFO - +Epoch 61 +2024-05-21 16:07:33,434 - INFO - [0/363571] Loss : 1.1876 +2024-05-21 16:08:11,678 - INFO - Test loss : 0.6001 +2024-05-21 16:08:11,678 - INFO - +Epoch 62 +2024-05-21 16:08:15,914 - INFO - [0/363571] Loss : 1.1781 +2024-05-21 16:08:54,498 - INFO - Test loss : 0.6065 +2024-05-21 16:08:54,498 - INFO - +Epoch 63 +2024-05-21 16:08:59,187 - INFO - [0/363571] Loss : 1.1819 +2024-05-21 16:09:37,928 - INFO - Test loss : 0.6119 +2024-05-21 16:09:37,928 - INFO - +Epoch 64 +2024-05-21 16:09:42,662 - INFO - [0/363571] Loss : 1.1753 +2024-05-21 16:10:21,426 - INFO - Test loss : 0.6157 +2024-05-21 16:10:21,426 - INFO - +Epoch 65 +2024-05-21 16:10:26,102 - INFO - [0/363571] Loss : 1.1802 +2024-05-21 16:11:04,961 - INFO - Test loss : 0.6164 +2024-05-21 16:11:04,961 - INFO - +Epoch 66 +2024-05-21 16:11:09,694 - INFO - [0/363571] Loss : 1.1701 
+2024-05-21 16:11:48,482 - INFO - Test loss : 0.6152 +2024-05-21 16:11:48,482 - INFO - +Epoch 67 +2024-05-21 16:11:53,088 - INFO - [0/363571] Loss : 1.1740 +2024-05-21 16:12:32,002 - INFO - Test loss : 0.6122 +2024-05-21 16:12:32,002 - INFO - +Epoch 68 +2024-05-21 16:12:36,632 - INFO - [0/363571] Loss : 1.1698 +2024-05-21 16:13:16,363 - INFO - Test loss : 0.6080 +2024-05-21 16:13:16,363 - INFO - +Epoch 69 +2024-05-21 16:13:20,955 - INFO - [0/363571] Loss : 1.1648 +2024-05-21 16:13:48,906 - INFO - Test loss : 0.6044 +2024-05-21 16:13:48,906 - INFO - +Epoch 70 +2024-05-21 16:13:52,715 - INFO - [0/363571] Loss : 1.1730 +2024-05-21 16:14:18,297 - INFO - Test loss : 0.5995 +2024-05-21 16:14:18,898 - INFO - (53041,) +2024-05-21 16:14:19,097 - INFO - Split ID: 0 +2024-05-21 16:14:19,115 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 19.7 +2024-05-21 16:14:19,117 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 33.27 +2024-05-21 16:14:19,119 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 41.35 +2024-05-21 16:14:19,120 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 55.67 +2024-05-21 16:14:19,127 - INFO - +No prior +2024-05-21 16:14:19,132 - INFO - (53041,) +2024-05-21 16:14:19,596 - INFO - Split ID: 0 +2024-05-21 16:14:19,597 - INFO - Top 1 (Epoch 70)acc (%): 69.83 +2024-05-21 16:14:19,597 - INFO - Top 3 (Epoch 70)acc (%): 84.61 +2024-05-21 16:14:19,597 - INFO - Top 5 (Epoch 70)acc (%): 89.23 +2024-05-21 16:14:19,597 - INFO - Top 10 (Epoch 70)acc (%): 94.29 +2024-05-21 16:14:57,346 - INFO - Split ID: 0 +2024-05-21 16:14:57,347 - INFO - Top 1 (Epoch 70)acc (%): 70.62 +2024-05-21 16:14:57,347 - INFO - Top 3 (Epoch 70)acc (%): 85.29 +2024-05-21 16:14:57,347 - INFO - Top 5 (Epoch 70)acc (%): 89.86 +2024-05-21 16:14:57,347 - INFO - Top 10 (Epoch 70)acc (%): 94.72 +2024-05-21 16:14:57,348 - INFO - +Epoch 71 +2024-05-21 16:15:00,437 - INFO - [0/363571] Loss : 1.1678 +2024-05-21 16:15:26,490 - INFO - Test loss : 0.5955 +2024-05-21 16:15:26,490 - INFO - +Epoch 72 +2024-05-21 16:15:29,722 - INFO - [0/363571] Loss : 1.1672 +2024-05-21 16:15:55,820 - INFO - Test loss : 0.5948 +2024-05-21 16:15:55,820 - INFO - +Epoch 73 +2024-05-21 16:15:59,546 - INFO - [0/363571] Loss : 1.1676 +2024-05-21 16:16:25,800 - INFO - Test loss : 0.5977 +2024-05-21 16:16:25,800 - INFO - +Epoch 74 +2024-05-21 16:16:29,407 - INFO - [0/363571] Loss : 1.1622 +2024-05-21 16:16:56,299 - INFO - Test loss : 0.6016 +2024-05-21 16:16:56,299 - INFO - +Epoch 75 +2024-05-21 16:16:59,796 - INFO - [0/363571] Loss : 1.1558 +2024-05-21 16:17:25,705 - INFO - Test loss : 0.6073 +2024-05-21 16:17:25,705 - INFO - +Epoch 76 +2024-05-21 16:17:29,016 - INFO - [0/363571] Loss : 1.1667 +2024-05-21 16:17:56,868 - INFO - Test loss : 0.6121 +2024-05-21 16:17:56,868 - INFO - +Epoch 77 +2024-05-21 16:18:00,355 - INFO - [0/363571] Loss : 1.1659 +2024-05-21 16:18:26,592 - INFO - Test loss : 0.6121 +2024-05-21 16:18:26,592 - INFO - +Epoch 78 +2024-05-21 16:18:29,825 - INFO - [0/363571] Loss : 1.1611 +2024-05-21 16:18:55,040 - INFO - Test loss : 0.6108 +2024-05-21 16:18:55,040 - INFO - +Epoch 79 +2024-05-21 16:18:58,254 - INFO - [0/363571] Loss : 1.1718 +2024-05-21 16:19:24,200 - INFO - Test loss : 0.6075 +2024-05-21 16:19:24,200 - INFO - +Epoch 80 +2024-05-21 16:19:27,307 - INFO - [0/363571] Loss : 1.1423 +2024-05-21 16:19:52,676 - INFO - Test loss : 0.6048 +2024-05-21 16:19:53,260 - INFO - (53041,) +2024-05-21 16:19:53,412 - INFO - Split ID: 0 +2024-05-21 16:19:53,430 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 19.99 +2024-05-21 16:19:53,432 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 33.45 
+2024-05-21 16:19:53,434 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 41.57 +2024-05-21 16:19:53,435 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 56.09 +2024-05-21 16:19:53,441 - INFO - +No prior +2024-05-21 16:19:53,446 - INFO - (53041,) +2024-05-21 16:19:53,604 - INFO - Split ID: 0 +2024-05-21 16:19:53,604 - INFO - Top 1 (Epoch 80)acc (%): 69.83 +2024-05-21 16:19:53,604 - INFO - Top 3 (Epoch 80)acc (%): 84.61 +2024-05-21 16:19:53,604 - INFO - Top 5 (Epoch 80)acc (%): 89.23 +2024-05-21 16:19:53,604 - INFO - Top 10 (Epoch 80)acc (%): 94.29 +2024-05-21 16:20:29,449 - INFO - Split ID: 0 +2024-05-21 16:20:29,449 - INFO - Top 1 (Epoch 80)acc (%): 70.63 +2024-05-21 16:20:29,449 - INFO - Top 3 (Epoch 80)acc (%): 85.32 +2024-05-21 16:20:29,449 - INFO - Top 5 (Epoch 80)acc (%): 89.86 +2024-05-21 16:20:29,449 - INFO - Top 10 (Epoch 80)acc (%): 94.74 +2024-05-21 16:20:29,450 - INFO - +Epoch 81 +2024-05-21 16:20:32,719 - INFO - [0/363571] Loss : 1.1505 +2024-05-21 16:20:57,771 - INFO - Test loss : 0.6030 +2024-05-21 16:20:57,771 - INFO - +Epoch 82 +2024-05-21 16:21:00,896 - INFO - [0/363571] Loss : 1.1596 +2024-05-21 16:21:25,688 - INFO - Test loss : 0.6010 +2024-05-21 16:21:25,688 - INFO - +Epoch 83 +2024-05-21 16:21:28,780 - INFO - [0/363571] Loss : 1.1544 +2024-05-21 16:21:54,570 - INFO - Test loss : 0.6007 +2024-05-21 16:21:54,570 - INFO - +Epoch 84 +2024-05-21 16:21:57,258 - INFO - [0/363571] Loss : 1.1451 +2024-05-21 16:22:22,656 - INFO - Test loss : 0.6015 +2024-05-21 16:22:22,656 - INFO - +Epoch 85 +2024-05-21 16:22:25,666 - INFO - [0/363571] Loss : 1.1586 +2024-05-21 16:22:50,552 - INFO - Test loss : 0.6024 +2024-05-21 16:22:50,552 - INFO - +Epoch 86 +2024-05-21 16:22:53,770 - INFO - [0/363571] Loss : 1.1509 +2024-05-21 16:23:20,303 - INFO - Test loss : 0.6011 +2024-05-21 16:23:20,303 - INFO - +Epoch 87 +2024-05-21 16:23:23,500 - INFO - [0/363571] Loss : 1.1428 +2024-05-21 16:23:48,557 - INFO - Test loss : 0.6005 +2024-05-21 16:23:48,557 - INFO - +Epoch 88 +2024-05-21 16:23:51,786 - INFO - [0/363571] Loss : 1.1515 +2024-05-21 16:24:17,691 - INFO - Test loss : 0.5981 +2024-05-21 16:24:17,691 - INFO - +Epoch 89 +2024-05-21 16:24:20,828 - INFO - [0/363571] Loss : 1.1479 +2024-05-21 16:24:47,028 - INFO - Test loss : 0.5947 +2024-05-21 16:24:47,028 - INFO - +Epoch 90 +2024-05-21 16:24:50,256 - INFO - [0/363571] Loss : 1.1527 +2024-05-21 16:25:15,166 - INFO - Test loss : 0.5903 +2024-05-21 16:25:15,780 - INFO - (53041,) +2024-05-21 16:25:15,934 - INFO - Split ID: 0 +2024-05-21 16:25:15,954 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 19.75 +2024-05-21 16:25:15,955 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 33.95 +2024-05-21 16:25:15,957 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 42.61 +2024-05-21 16:25:15,959 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 56.83 +2024-05-21 16:25:15,965 - INFO - +No prior +2024-05-21 16:25:15,970 - INFO - (53041,) +2024-05-21 16:25:16,121 - INFO - Split ID: 0 +2024-05-21 16:25:16,122 - INFO - Top 1 (Epoch 90)acc (%): 69.83 +2024-05-21 16:25:16,122 - INFO - Top 3 (Epoch 90)acc (%): 84.61 +2024-05-21 16:25:16,122 - INFO - Top 5 (Epoch 90)acc (%): 89.23 +2024-05-21 16:25:16,122 - INFO - Top 10 (Epoch 90)acc (%): 94.29 +2024-05-21 16:25:53,933 - INFO - Split ID: 0 +2024-05-21 16:25:53,933 - INFO - Top 1 (Epoch 90)acc (%): 70.65 +2024-05-21 16:25:53,934 - INFO - Top 3 (Epoch 90)acc (%): 85.35 +2024-05-21 16:25:53,934 - INFO - Top 5 (Epoch 90)acc (%): 89.86 +2024-05-21 16:25:53,934 - INFO - Top 10 (Epoch 90)acc (%): 94.75 +2024-05-21 16:25:53,934 - INFO - +Epoch 91 +2024-05-21 16:25:56,617 
- INFO - [0/363571] Loss : 1.1420 +2024-05-21 16:26:22,481 - INFO - Test loss : 0.5872 +2024-05-21 16:26:22,481 - INFO - +Epoch 92 +2024-05-21 16:26:25,431 - INFO - [0/363571] Loss : 1.1426 +2024-05-21 16:26:52,392 - INFO - Test loss : 0.5861 +2024-05-21 16:26:52,392 - INFO - +Epoch 93 +2024-05-21 16:26:55,509 - INFO - [0/363571] Loss : 1.1390 +2024-05-21 16:27:20,721 - INFO - Test loss : 0.5877 +2024-05-21 16:27:20,721 - INFO - +Epoch 94 +2024-05-21 16:27:24,152 - INFO - [0/363571] Loss : 1.1356 +2024-05-21 16:27:50,767 - INFO - Test loss : 0.5916 +2024-05-21 16:27:50,768 - INFO - +Epoch 95 +2024-05-21 16:27:53,978 - INFO - [0/363571] Loss : 1.1400 +2024-05-21 16:28:19,062 - INFO - Test loss : 0.5966 +2024-05-21 16:28:19,062 - INFO - +Epoch 96 +2024-05-21 16:28:22,171 - INFO - [0/363571] Loss : 1.1311 +2024-05-21 16:28:48,650 - INFO - Test loss : 0.6022 +2024-05-21 16:28:48,650 - INFO - +Epoch 97 +2024-05-21 16:28:51,864 - INFO - [0/363571] Loss : 1.1300 +2024-05-21 16:29:17,948 - INFO - Test loss : 0.6060 +2024-05-21 16:29:17,948 - INFO - +Epoch 98 +2024-05-21 16:29:21,082 - INFO - [0/363571] Loss : 1.1411 +2024-05-21 16:29:46,303 - INFO - Test loss : 0.6075 +2024-05-21 16:29:46,303 - INFO - +Epoch 99 +2024-05-21 16:29:49,408 - INFO - [0/363571] Loss : 1.1275 +2024-05-21 16:30:15,419 - INFO - Test loss : 0.6072 +2024-05-21 16:30:15,419 - INFO - Saving output model to ../models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-21 16:30:15,429 - INFO - Saving output model to ../models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-21 16:30:15,456 - INFO - +No prior +2024-05-21 16:30:15,461 - INFO - (53041,) +2024-05-21 16:30:15,615 - INFO - Split ID: 0 +2024-05-21 16:30:15,615 - INFO - Top 1 acc (%): 69.83 +2024-05-21 16:30:15,615 - INFO - Top 3 acc (%): 84.61 +2024-05-21 16:30:15,615 - INFO - Top 5 acc (%): 89.23 +2024-05-21 16:30:15,615 - INFO - Top 10 acc (%): 94.29 +2024-05-21 16:30:53,475 - INFO - Split ID: 0 +2024-05-21 16:30:53,476 - INFO - Top 1 acc (%): 70.68 +2024-05-21 16:30:53,476 - INFO - Top 3 acc (%): 85.36 +2024-05-21 16:30:53,476 - INFO - Top 5 acc (%): 89.89 +2024-05-21 16:30:53,476 - INFO - Top 10 acc (%): 94.76 +2024-05-21 16:30:53,477 - INFO - +Space2Vec-grid +2024-05-21 16:30:53,477 - INFO - Model : model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-21 16:30:54,066 - INFO - (53041,) +2024-05-21 16:30:54,217 - INFO - Split ID: 0 +2024-05-21 16:30:54,236 - INFO - Top 1 LocEnc acc (%): 19.99 +2024-05-21 16:30:54,238 - INFO - Top 3 LocEnc acc (%): 34.25 +2024-05-21 16:30:54,240 - INFO - Top 5 LocEnc acc (%): 42.86 +2024-05-21 16:30:54,242 - INFO - Top 10 LocEnc acc (%): 56.89 +2024-05-22 02:40:40,336 - INFO - +num_classes 62 +2024-05-22 02:40:40,336 - INFO - num train 363571 +2024-05-22 02:40:40,336 - INFO - num val 53041 +2024-05-22 02:40:40,336 - INFO - train loss full_loss +2024-05-22 02:40:40,336 - INFO - model name ../models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-22 02:40:40,336 - INFO - num users 1 +2024-05-22 02:40:41,304 - INFO - +Epoch 0 +2024-05-22 02:40:42,312 - INFO - [0/363571] Loss : 2.2592 +2024-05-22 02:40:45,331 - INFO - Test loss : 1.1759 +2024-05-22 02:40:45,331 - INFO - +Epoch 1 +2024-05-22 02:40:45,834 - INFO - [0/363571] Loss : 1.8817 +2024-05-22 
02:40:48,841 - INFO - Test loss : 0.4632 +2024-05-22 02:40:48,841 - INFO - +Epoch 2 +2024-05-22 02:40:49,266 - INFO - [0/363571] Loss : 1.8692 +2024-05-22 02:40:52,275 - INFO - Test loss : 2.1559 +2024-05-22 02:40:52,275 - INFO - +Epoch 3 +2024-05-22 02:40:52,705 - INFO - [0/363571] Loss : 2.2445 +2024-05-22 02:40:55,792 - INFO - Test loss : 0.6375 +2024-05-22 02:40:55,792 - INFO - +Epoch 4 +2024-05-22 02:40:56,215 - INFO - [0/363571] Loss : 1.6191 +2024-05-22 02:40:59,232 - INFO - Test loss : 0.3706 +2024-05-22 02:40:59,232 - INFO - +Epoch 5 +2024-05-22 02:40:59,655 - INFO - [0/363571] Loss : 1.7602 +2024-05-22 02:41:02,665 - INFO - Test loss : 0.4864 +2024-05-22 02:41:02,665 - INFO - +Epoch 6 +2024-05-22 02:41:03,168 - INFO - [0/363571] Loss : 1.6238 +2024-05-22 02:41:06,178 - INFO - Test loss : 0.8195 +2024-05-22 02:41:06,178 - INFO - +Epoch 7 +2024-05-22 02:41:06,601 - INFO - [0/363571] Loss : 1.5095 +2024-05-22 02:41:09,606 - INFO - Test loss : 1.1357 +2024-05-22 02:41:09,606 - INFO - +Epoch 8 +2024-05-22 02:41:10,109 - INFO - [0/363571] Loss : 1.5534 +2024-05-22 02:41:13,124 - INFO - Test loss : 0.9978 +2024-05-22 02:41:13,124 - INFO - +Epoch 9 +2024-05-22 02:41:13,545 - INFO - [0/363571] Loss : 1.4979 +2024-05-22 02:41:16,553 - INFO - Test loss : 0.7134 +2024-05-22 02:41:16,553 - INFO - +Epoch 10 +2024-05-22 02:41:17,058 - INFO - [0/363571] Loss : 1.4447 +2024-05-22 02:41:20,080 - INFO - Test loss : 0.5485 +2024-05-22 02:41:20,678 - INFO - (53041,) +2024-05-22 02:41:20,831 - INFO - Split ID: 0 +2024-05-22 02:41:20,851 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 17.05 +2024-05-22 02:41:20,853 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 28.91 +2024-05-22 02:41:20,854 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 36.57 +2024-05-22 02:41:20,856 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 49.54 +2024-05-22 02:41:20,862 - INFO - +No prior +2024-05-22 02:41:20,868 - INFO - (53041,) +2024-05-22 02:41:21,023 - INFO - Split ID: 0 +2024-05-22 02:41:21,023 - INFO - Top 1 (Epoch 10)acc (%): 69.83 +2024-05-22 02:41:21,023 - INFO - Top 3 (Epoch 10)acc (%): 84.61 +2024-05-22 02:41:21,023 - INFO - Top 5 (Epoch 10)acc (%): 89.23 +2024-05-22 02:41:21,023 - INFO - Top 10 (Epoch 10)acc (%): 94.29 +2024-05-22 02:41:42,874 - INFO - Split ID: 0 +2024-05-22 02:41:42,875 - INFO - Top 1 (Epoch 10)acc (%): 70.2 +2024-05-22 02:41:42,875 - INFO - Top 3 (Epoch 10)acc (%): 84.93 +2024-05-22 02:41:42,875 - INFO - Top 5 (Epoch 10)acc (%): 89.58 +2024-05-22 02:41:42,875 - INFO - Top 10 (Epoch 10)acc (%): 94.46 +2024-05-22 02:41:42,875 - INFO - +Epoch 11 +2024-05-22 02:41:43,295 - INFO - [0/363571] Loss : 1.4579 +2024-05-22 02:41:46,306 - INFO - Test loss : 0.5206 +2024-05-22 02:41:46,306 - INFO - +Epoch 12 +2024-05-22 02:41:46,730 - INFO - [0/363571] Loss : 1.4542 +2024-05-22 02:41:49,829 - INFO - Test loss : 0.5864 +2024-05-22 02:41:49,829 - INFO - +Epoch 13 +2024-05-22 02:41:50,252 - INFO - [0/363571] Loss : 1.4143 +2024-05-22 02:41:53,250 - INFO - Test loss : 0.6973 +2024-05-22 02:41:53,250 - INFO - +Epoch 14 +2024-05-22 02:41:53,674 - INFO - [0/363571] Loss : 1.3992 +2024-05-22 02:41:56,667 - INFO - Test loss : 0.7967 +2024-05-22 02:41:56,668 - INFO - +Epoch 15 +2024-05-22 02:41:57,172 - INFO - [0/363571] Loss : 1.3795 +2024-05-22 02:42:00,168 - INFO - Test loss : 0.8302 +2024-05-22 02:42:00,169 - INFO - +Epoch 16 +2024-05-22 02:42:00,591 - INFO - [0/363571] Loss : 1.3646 +2024-05-22 02:42:03,586 - INFO - Test loss : 0.7912 +2024-05-22 02:42:03,586 - INFO - +Epoch 17 +2024-05-22 02:42:04,092 - INFO - [0/363571] Loss : 1.3660 
+2024-05-22 02:42:07,106 - INFO - Test loss : 0.7142 +2024-05-22 02:42:07,106 - INFO - +Epoch 18 +2024-05-22 02:42:07,532 - INFO - [0/363571] Loss : 1.3386 +2024-05-22 02:42:10,523 - INFO - Test loss : 0.6451 +2024-05-22 02:42:10,523 - INFO - +Epoch 19 +2024-05-22 02:42:10,947 - INFO - [0/363571] Loss : 1.3414 +2024-05-22 02:42:14,040 - INFO - Test loss : 0.6032 +2024-05-22 02:42:14,040 - INFO - +Epoch 20 +2024-05-22 02:42:14,464 - INFO - [0/363571] Loss : 1.3367 +2024-05-22 02:42:17,463 - INFO - Test loss : 0.5888 +2024-05-22 02:42:18,040 - INFO - (53041,) +2024-05-22 02:42:18,195 - INFO - Split ID: 0 +2024-05-22 02:42:18,217 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 17.96 +2024-05-22 02:42:18,218 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 30.98 +2024-05-22 02:42:18,220 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 39.38 +2024-05-22 02:42:18,222 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 52.33 +2024-05-22 02:42:18,228 - INFO - +No prior +2024-05-22 02:42:18,233 - INFO - (53041,) +2024-05-22 02:42:18,383 - INFO - Split ID: 0 +2024-05-22 02:42:18,384 - INFO - Top 1 (Epoch 20)acc (%): 69.83 +2024-05-22 02:42:18,384 - INFO - Top 3 (Epoch 20)acc (%): 84.61 +2024-05-22 02:42:18,384 - INFO - Top 5 (Epoch 20)acc (%): 89.23 +2024-05-22 02:42:18,384 - INFO - Top 10 (Epoch 20)acc (%): 94.29 +2024-05-22 02:42:40,251 - INFO - Split ID: 0 +2024-05-22 02:42:40,251 - INFO - Top 1 (Epoch 20)acc (%): 70.39 +2024-05-22 02:42:40,251 - INFO - Top 3 (Epoch 20)acc (%): 85.07 +2024-05-22 02:42:40,251 - INFO - Top 5 (Epoch 20)acc (%): 89.7 +2024-05-22 02:42:40,251 - INFO - Top 10 (Epoch 20)acc (%): 94.58 +2024-05-22 02:42:40,252 - INFO - +Epoch 21 +2024-05-22 02:42:40,674 - INFO - [0/363571] Loss : 1.3247 +2024-05-22 02:42:43,689 - INFO - Test loss : 0.5997 +2024-05-22 02:42:43,689 - INFO - +Epoch 22 +2024-05-22 02:42:44,193 - INFO - [0/363571] Loss : 1.3133 +2024-05-22 02:42:47,198 - INFO - Test loss : 0.6264 +2024-05-22 02:42:47,198 - INFO - +Epoch 23 +2024-05-22 02:42:47,623 - INFO - [0/363571] Loss : 1.3021 +2024-05-22 02:42:50,648 - INFO - Test loss : 0.6552 +2024-05-22 02:42:50,648 - INFO - +Epoch 24 +2024-05-22 02:42:51,153 - INFO - [0/363571] Loss : 1.2944 +2024-05-22 02:42:54,162 - INFO - Test loss : 0.6714 +2024-05-22 02:42:54,162 - INFO - +Epoch 25 +2024-05-22 02:42:54,584 - INFO - [0/363571] Loss : 1.2825 +2024-05-22 02:42:57,589 - INFO - Test loss : 0.6700 +2024-05-22 02:42:57,589 - INFO - +Epoch 26 +2024-05-22 02:42:58,098 - INFO - [0/363571] Loss : 1.2810 +2024-05-22 02:43:01,103 - INFO - Test loss : 0.6530 +2024-05-22 02:43:01,103 - INFO - +Epoch 27 +2024-05-22 02:43:01,527 - INFO - [0/363571] Loss : 1.2836 +2024-05-22 02:43:04,541 - INFO - Test loss : 0.6271 +2024-05-22 02:43:04,541 - INFO - +Epoch 28 +2024-05-22 02:43:04,965 - INFO - [0/363571] Loss : 1.2706 +2024-05-22 02:43:08,060 - INFO - Test loss : 0.6058 +2024-05-22 02:43:08,060 - INFO - +Epoch 29 +2024-05-22 02:43:08,485 - INFO - [0/363571] Loss : 1.2756 +2024-05-22 02:43:11,484 - INFO - Test loss : 0.5954 +2024-05-22 02:43:11,484 - INFO - +Epoch 30 +2024-05-22 02:43:11,909 - INFO - [0/363571] Loss : 1.2586 +2024-05-22 02:43:14,891 - INFO - Test loss : 0.5983 +2024-05-22 02:43:15,474 - INFO - (53041,) +2024-05-22 02:43:15,625 - INFO - Split ID: 0 +2024-05-22 02:43:15,644 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 17.81 +2024-05-22 02:43:15,645 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 30.94 +2024-05-22 02:43:15,647 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 38.68 +2024-05-22 02:43:15,649 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 52.82 +2024-05-22 
02:43:15,655 - INFO - +No prior +2024-05-22 02:43:15,660 - INFO - (53041,) +2024-05-22 02:43:15,808 - INFO - Split ID: 0 +2024-05-22 02:43:15,809 - INFO - Top 1 (Epoch 30)acc (%): 69.83 +2024-05-22 02:43:15,809 - INFO - Top 3 (Epoch 30)acc (%): 84.61 +2024-05-22 02:43:15,809 - INFO - Top 5 (Epoch 30)acc (%): 89.23 +2024-05-22 02:43:15,809 - INFO - Top 10 (Epoch 30)acc (%): 94.29 +2024-05-22 02:43:37,697 - INFO - Split ID: 0 +2024-05-22 02:43:37,697 - INFO - Top 1 (Epoch 30)acc (%): 70.47 +2024-05-22 02:43:37,697 - INFO - Top 3 (Epoch 30)acc (%): 85.17 +2024-05-22 02:43:37,697 - INFO - Top 5 (Epoch 30)acc (%): 89.75 +2024-05-22 02:43:37,698 - INFO - Top 10 (Epoch 30)acc (%): 94.64 +2024-05-22 02:43:37,698 - INFO - +Epoch 31 +2024-05-22 02:43:38,203 - INFO - [0/363571] Loss : 1.2623 +2024-05-22 02:43:41,245 - INFO - Test loss : 0.6092 +2024-05-22 02:43:41,245 - INFO - +Epoch 32 +2024-05-22 02:43:41,673 - INFO - [0/363571] Loss : 1.2541 +2024-05-22 02:43:44,719 - INFO - Test loss : 0.6267 +2024-05-22 02:43:44,719 - INFO - +Epoch 33 +2024-05-22 02:43:45,231 - INFO - [0/363571] Loss : 1.2462 +2024-05-22 02:43:48,267 - INFO - Test loss : 0.6440 +2024-05-22 02:43:48,268 - INFO - +Epoch 34 +2024-05-22 02:43:48,695 - INFO - [0/363571] Loss : 1.2439 +2024-05-22 02:43:51,730 - INFO - Test loss : 0.6517 +2024-05-22 02:43:51,730 - INFO - +Epoch 35 +2024-05-22 02:43:52,156 - INFO - [0/363571] Loss : 1.2448 +2024-05-22 02:43:55,265 - INFO - Test loss : 0.6470 +2024-05-22 02:43:55,265 - INFO - +Epoch 36 +2024-05-22 02:43:55,695 - INFO - [0/363571] Loss : 1.2358 +2024-05-22 02:43:58,733 - INFO - Test loss : 0.6337 +2024-05-22 02:43:58,733 - INFO - +Epoch 37 +2024-05-22 02:43:59,156 - INFO - [0/363571] Loss : 1.2169 +2024-05-22 02:44:02,157 - INFO - Test loss : 0.6169 +2024-05-22 02:44:02,157 - INFO - +Epoch 38 +2024-05-22 02:44:02,662 - INFO - [0/363571] Loss : 1.2394 +2024-05-22 02:44:05,662 - INFO - Test loss : 0.5978 +2024-05-22 02:44:05,662 - INFO - +Epoch 39 +2024-05-22 02:44:06,086 - INFO - [0/363571] Loss : 1.2315 +2024-05-22 02:44:09,097 - INFO - Test loss : 0.5850 +2024-05-22 02:44:09,097 - INFO - +Epoch 40 +2024-05-22 02:44:09,603 - INFO - [0/363571] Loss : 1.2230 +2024-05-22 02:44:12,610 - INFO - Test loss : 0.5798 +2024-05-22 02:44:13,196 - INFO - (53041,) +2024-05-22 02:44:13,348 - INFO - Split ID: 0 +2024-05-22 02:44:13,369 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 19.07 +2024-05-22 02:44:13,371 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 32.41 +2024-05-22 02:44:13,373 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 40.73 +2024-05-22 02:44:13,375 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 55.02 +2024-05-22 02:44:13,380 - INFO - +No prior +2024-05-22 02:44:13,385 - INFO - (53041,) +2024-05-22 02:44:13,538 - INFO - Split ID: 0 +2024-05-22 02:44:13,538 - INFO - Top 1 (Epoch 40)acc (%): 69.83 +2024-05-22 02:44:13,538 - INFO - Top 3 (Epoch 40)acc (%): 84.61 +2024-05-22 02:44:13,539 - INFO - Top 5 (Epoch 40)acc (%): 89.23 +2024-05-22 02:44:13,539 - INFO - Top 10 (Epoch 40)acc (%): 94.29 +2024-05-22 02:44:35,413 - INFO - Split ID: 0 +2024-05-22 02:44:35,413 - INFO - Top 1 (Epoch 40)acc (%): 70.5 +2024-05-22 02:44:35,413 - INFO - Top 3 (Epoch 40)acc (%): 85.24 +2024-05-22 02:44:35,413 - INFO - Top 5 (Epoch 40)acc (%): 89.79 +2024-05-22 02:44:35,413 - INFO - Top 10 (Epoch 40)acc (%): 94.67 +2024-05-22 02:44:35,414 - INFO - +Epoch 41 +2024-05-22 02:44:35,838 - INFO - [0/363571] Loss : 1.2283 +2024-05-22 02:44:38,848 - INFO - Test loss : 0.5828 +2024-05-22 02:44:38,848 - INFO - +Epoch 42 +2024-05-22 02:44:39,358 - 
INFO - [0/363571] Loss : 1.2105 +2024-05-22 02:44:42,367 - INFO - Test loss : 0.5938 +2024-05-22 02:44:42,367 - INFO - +Epoch 43 +2024-05-22 02:44:42,792 - INFO - [0/363571] Loss : 1.2140 +2024-05-22 02:44:45,827 - INFO - Test loss : 0.6085 +2024-05-22 02:44:45,827 - INFO - +Epoch 44 +2024-05-22 02:44:46,252 - INFO - [0/363571] Loss : 1.2061 +2024-05-22 02:44:49,339 - INFO - Test loss : 0.6233 +2024-05-22 02:44:49,339 - INFO - +Epoch 45 +2024-05-22 02:44:49,767 - INFO - [0/363571] Loss : 1.2179 +2024-05-22 02:44:52,777 - INFO - Test loss : 0.6292 +2024-05-22 02:44:52,777 - INFO - +Epoch 46 +2024-05-22 02:44:53,198 - INFO - [0/363571] Loss : 1.2010 +2024-05-22 02:44:56,218 - INFO - Test loss : 0.6286 +2024-05-22 02:44:56,218 - INFO - +Epoch 47 +2024-05-22 02:44:56,721 - INFO - [0/363571] Loss : 1.2015 +2024-05-22 02:44:59,722 - INFO - Test loss : 0.6217 +2024-05-22 02:44:59,722 - INFO - +Epoch 48 +2024-05-22 02:45:00,145 - INFO - [0/363571] Loss : 1.2037 +2024-05-22 02:45:03,174 - INFO - Test loss : 0.6091 +2024-05-22 02:45:03,174 - INFO - +Epoch 49 +2024-05-22 02:45:03,686 - INFO - [0/363571] Loss : 1.1942 +2024-05-22 02:45:06,697 - INFO - Test loss : 0.5957 +2024-05-22 02:45:06,697 - INFO - +Epoch 50 +2024-05-22 02:45:07,121 - INFO - [0/363571] Loss : 1.1980 +2024-05-22 02:45:10,142 - INFO - Test loss : 0.5843 +2024-05-22 02:45:10,731 - INFO - (53041,) +2024-05-22 02:45:10,881 - INFO - Split ID: 0 +2024-05-22 02:45:10,900 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 19.13 +2024-05-22 02:45:10,902 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 32.82 +2024-05-22 02:45:10,904 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 40.9 +2024-05-22 02:45:10,906 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 55.19 +2024-05-22 02:45:10,912 - INFO - +No prior +2024-05-22 02:45:10,917 - INFO - (53041,) +2024-05-22 02:45:11,070 - INFO - Split ID: 0 +2024-05-22 02:45:11,071 - INFO - Top 1 (Epoch 50)acc (%): 69.83 +2024-05-22 02:45:11,071 - INFO - Top 3 (Epoch 50)acc (%): 84.61 +2024-05-22 02:45:11,071 - INFO - Top 5 (Epoch 50)acc (%): 89.23 +2024-05-22 02:45:11,071 - INFO - Top 10 (Epoch 50)acc (%): 94.29 +2024-05-22 02:45:33,281 - INFO - Split ID: 0 +2024-05-22 02:45:33,282 - INFO - Top 1 (Epoch 50)acc (%): 70.53 +2024-05-22 02:45:33,282 - INFO - Top 3 (Epoch 50)acc (%): 85.26 +2024-05-22 02:45:33,282 - INFO - Top 5 (Epoch 50)acc (%): 89.82 +2024-05-22 02:45:33,282 - INFO - Top 10 (Epoch 50)acc (%): 94.72 +2024-05-22 02:45:33,283 - INFO - +Epoch 51 +2024-05-22 02:45:33,708 - INFO - [0/363571] Loss : 1.1909 +2024-05-22 02:45:36,831 - INFO - Test loss : 0.5779 +2024-05-22 02:45:36,831 - INFO - +Epoch 52 +2024-05-22 02:45:37,257 - INFO - [0/363571] Loss : 1.1956 +2024-05-22 02:45:40,293 - INFO - Test loss : 0.5792 +2024-05-22 02:45:40,294 - INFO - +Epoch 53 +2024-05-22 02:45:40,720 - INFO - [0/363571] Loss : 1.1880 +2024-05-22 02:45:43,757 - INFO - Test loss : 0.5863 +2024-05-22 02:45:43,758 - INFO - +Epoch 54 +2024-05-22 02:45:44,267 - INFO - [0/363571] Loss : 1.1780 +2024-05-22 02:45:47,299 - INFO - Test loss : 0.5967 +2024-05-22 02:45:47,299 - INFO - +Epoch 55 +2024-05-22 02:45:47,728 - INFO - [0/363571] Loss : 1.1872 +2024-05-22 02:45:50,764 - INFO - Test loss : 0.6076 +2024-05-22 02:45:50,764 - INFO - +Epoch 56 +2024-05-22 02:45:51,273 - INFO - [0/363571] Loss : 1.1706 +2024-05-22 02:45:54,302 - INFO - Test loss : 0.6160 +2024-05-22 02:45:54,303 - INFO - +Epoch 57 +2024-05-22 02:45:54,727 - INFO - [0/363571] Loss : 1.1840 +2024-05-22 02:45:57,752 - INFO - Test loss : 0.6166 +2024-05-22 02:45:57,753 - INFO - +Epoch 58 
+2024-05-22 02:45:58,262 - INFO - [0/363571] Loss : 1.1731 +2024-05-22 02:46:01,295 - INFO - Test loss : 0.6109 +2024-05-22 02:46:01,295 - INFO - +Epoch 59 +2024-05-22 02:46:01,722 - INFO - [0/363571] Loss : 1.1662 +2024-05-22 02:46:04,741 - INFO - Test loss : 0.6026 +2024-05-22 02:46:04,742 - INFO - +Epoch 60 +2024-05-22 02:46:05,167 - INFO - [0/363571] Loss : 1.1788 +2024-05-22 02:46:08,267 - INFO - Test loss : 0.5921 +2024-05-22 02:46:08,856 - INFO - (53041,) +2024-05-22 02:46:09,010 - INFO - Split ID: 0 +2024-05-22 02:46:09,032 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 19.24 +2024-05-22 02:46:09,034 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 33.05 +2024-05-22 02:46:09,036 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 41.72 +2024-05-22 02:46:09,038 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 55.91 +2024-05-22 02:46:09,044 - INFO - +No prior +2024-05-22 02:46:09,049 - INFO - (53041,) +2024-05-22 02:46:09,203 - INFO - Split ID: 0 +2024-05-22 02:46:09,203 - INFO - Top 1 (Epoch 60)acc (%): 69.83 +2024-05-22 02:46:09,203 - INFO - Top 3 (Epoch 60)acc (%): 84.61 +2024-05-22 02:46:09,203 - INFO - Top 5 (Epoch 60)acc (%): 89.23 +2024-05-22 02:46:09,204 - INFO - Top 10 (Epoch 60)acc (%): 94.29 +2024-05-22 02:46:31,124 - INFO - Split ID: 0 +2024-05-22 02:46:31,124 - INFO - Top 1 (Epoch 60)acc (%): 70.61 +2024-05-22 02:46:31,124 - INFO - Top 3 (Epoch 60)acc (%): 85.27 +2024-05-22 02:46:31,124 - INFO - Top 5 (Epoch 60)acc (%): 89.87 +2024-05-22 02:46:31,125 - INFO - Top 10 (Epoch 60)acc (%): 94.76 +2024-05-22 02:46:31,125 - INFO - +Epoch 61 +2024-05-22 02:46:31,551 - INFO - [0/363571] Loss : 1.1727 +2024-05-22 02:46:34,584 - INFO - Test loss : 0.5832 +2024-05-22 02:46:34,584 - INFO - +Epoch 62 +2024-05-22 02:46:35,012 - INFO - [0/363571] Loss : 1.1649 +2024-05-22 02:46:38,028 - INFO - Test loss : 0.5792 +2024-05-22 02:46:38,028 - INFO - +Epoch 63 +2024-05-22 02:46:38,537 - INFO - [0/363571] Loss : 1.1719 +2024-05-22 02:46:41,603 - INFO - Test loss : 0.5791 +2024-05-22 02:46:41,603 - INFO - +Epoch 64 +2024-05-22 02:46:42,029 - INFO - [0/363571] Loss : 1.1720 +2024-05-22 02:46:45,056 - INFO - Test loss : 0.5828 +2024-05-22 02:46:45,056 - INFO - +Epoch 65 +2024-05-22 02:46:45,567 - INFO - [0/363571] Loss : 1.1687 +2024-05-22 02:46:48,594 - INFO - Test loss : 0.5877 +2024-05-22 02:46:48,594 - INFO - +Epoch 66 +2024-05-22 02:46:49,019 - INFO - [0/363571] Loss : 1.1533 +2024-05-22 02:46:52,044 - INFO - Test loss : 0.5927 +2024-05-22 02:46:52,044 - INFO - +Epoch 67 +2024-05-22 02:46:52,470 - INFO - [0/363571] Loss : 1.1530 +2024-05-22 02:46:55,562 - INFO - Test loss : 0.5970 +2024-05-22 02:46:55,562 - INFO - +Epoch 68 +2024-05-22 02:46:55,988 - INFO - [0/363571] Loss : 1.1575 +2024-05-22 02:46:58,992 - INFO - Test loss : 0.5993 +2024-05-22 02:46:58,992 - INFO - +Epoch 69 +2024-05-22 02:46:59,423 - INFO - [0/363571] Loss : 1.1504 +2024-05-22 02:47:02,448 - INFO - Test loss : 0.5993 +2024-05-22 02:47:02,448 - INFO - +Epoch 70 +2024-05-22 02:47:02,956 - INFO - [0/363571] Loss : 1.1638 +2024-05-22 02:47:05,953 - INFO - Test loss : 0.5960 +2024-05-22 02:47:06,540 - INFO - (53041,) +2024-05-22 02:47:06,692 - INFO - Split ID: 0 +2024-05-22 02:47:06,711 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 19.52 +2024-05-22 02:47:06,713 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 33.86 +2024-05-22 02:47:06,714 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 41.89 +2024-05-22 02:47:06,716 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 55.71 +2024-05-22 02:47:06,722 - INFO - +No prior +2024-05-22 02:47:06,727 - INFO - (53041,) +2024-05-22 02:47:06,879 - 
INFO - Split ID: 0 +2024-05-22 02:47:06,879 - INFO - Top 1 (Epoch 70)acc (%): 69.83 +2024-05-22 02:47:06,880 - INFO - Top 3 (Epoch 70)acc (%): 84.61 +2024-05-22 02:47:06,880 - INFO - Top 5 (Epoch 70)acc (%): 89.23 +2024-05-22 02:47:06,880 - INFO - Top 10 (Epoch 70)acc (%): 94.29 +2024-05-22 02:47:28,839 - INFO - Split ID: 0 +2024-05-22 02:47:28,840 - INFO - Top 1 (Epoch 70)acc (%): 70.64 +2024-05-22 02:47:28,840 - INFO - Top 3 (Epoch 70)acc (%): 85.31 +2024-05-22 02:47:28,840 - INFO - Top 5 (Epoch 70)acc (%): 89.88 +2024-05-22 02:47:28,840 - INFO - Top 10 (Epoch 70)acc (%): 94.74 +2024-05-22 02:47:28,841 - INFO - +Epoch 71 +2024-05-22 02:47:29,266 - INFO - [0/363571] Loss : 1.1549 +2024-05-22 02:47:32,288 - INFO - Test loss : 0.5921 +2024-05-22 02:47:32,289 - INFO - +Epoch 72 +2024-05-22 02:47:32,802 - INFO - [0/363571] Loss : 1.1470 +2024-05-22 02:47:35,828 - INFO - Test loss : 0.5877 +2024-05-22 02:47:35,828 - INFO - +Epoch 73 +2024-05-22 02:47:36,254 - INFO - [0/363571] Loss : 1.1388 +2024-05-22 02:47:39,287 - INFO - Test loss : 0.5849 +2024-05-22 02:47:39,287 - INFO - +Epoch 74 +2024-05-22 02:47:39,807 - INFO - [0/363571] Loss : 1.1735 +2024-05-22 02:47:42,826 - INFO - Test loss : 0.5817 +2024-05-22 02:47:42,826 - INFO - +Epoch 75 +2024-05-22 02:47:43,253 - INFO - [0/363571] Loss : 1.1484 +2024-05-22 02:47:46,271 - INFO - Test loss : 0.5808 +2024-05-22 02:47:46,271 - INFO - +Epoch 76 +2024-05-22 02:47:46,694 - INFO - [0/363571] Loss : 1.1429 +2024-05-22 02:47:49,794 - INFO - Test loss : 0.5803 +2024-05-22 02:47:49,794 - INFO - +Epoch 77 +2024-05-22 02:47:50,221 - INFO - [0/363571] Loss : 1.1355 +2024-05-22 02:47:53,245 - INFO - Test loss : 0.5822 +2024-05-22 02:47:53,245 - INFO - +Epoch 78 +2024-05-22 02:47:53,669 - INFO - [0/363571] Loss : 1.1393 +2024-05-22 02:47:56,693 - INFO - Test loss : 0.5847 +2024-05-22 02:47:56,693 - INFO - +Epoch 79 +2024-05-22 02:47:57,199 - INFO - [0/363571] Loss : 1.1424 +2024-05-22 02:48:00,217 - INFO - Test loss : 0.5867 +2024-05-22 02:48:00,217 - INFO - +Epoch 80 +2024-05-22 02:48:00,642 - INFO - [0/363571] Loss : 1.1423 +2024-05-22 02:48:03,668 - INFO - Test loss : 0.5885 +2024-05-22 02:48:04,257 - INFO - (53041,) +2024-05-22 02:48:04,409 - INFO - Split ID: 0 +2024-05-22 02:48:04,427 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 19.66 +2024-05-22 02:48:04,429 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 33.83 +2024-05-22 02:48:04,431 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 42.21 +2024-05-22 02:48:04,433 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 56.61 +2024-05-22 02:48:04,439 - INFO - +No prior +2024-05-22 02:48:04,444 - INFO - (53041,) +2024-05-22 02:48:04,598 - INFO - Split ID: 0 +2024-05-22 02:48:04,598 - INFO - Top 1 (Epoch 80)acc (%): 69.83 +2024-05-22 02:48:04,598 - INFO - Top 3 (Epoch 80)acc (%): 84.61 +2024-05-22 02:48:04,598 - INFO - Top 5 (Epoch 80)acc (%): 89.23 +2024-05-22 02:48:04,598 - INFO - Top 10 (Epoch 80)acc (%): 94.29 +2024-05-22 02:48:26,559 - INFO - Split ID: 0 +2024-05-22 02:48:26,559 - INFO - Top 1 (Epoch 80)acc (%): 70.63 +2024-05-22 02:48:26,560 - INFO - Top 3 (Epoch 80)acc (%): 85.32 +2024-05-22 02:48:26,560 - INFO - Top 5 (Epoch 80)acc (%): 89.91 +2024-05-22 02:48:26,560 - INFO - Top 10 (Epoch 80)acc (%): 94.77 +2024-05-22 02:48:26,560 - INFO - +Epoch 81 +2024-05-22 02:48:27,070 - INFO - [0/363571] Loss : 1.1368 +2024-05-22 02:48:30,096 - INFO - Test loss : 0.5888 +2024-05-22 02:48:30,096 - INFO - +Epoch 82 +2024-05-22 02:48:30,523 - INFO - [0/363571] Loss : 1.1327 +2024-05-22 02:48:33,539 - INFO - Test loss : 0.5883 +2024-05-22 
02:48:33,539 - INFO - +Epoch 83 +2024-05-22 02:48:33,974 - INFO - [0/363571] Loss : 1.1289 +2024-05-22 02:48:37,087 - INFO - Test loss : 0.5875 +2024-05-22 02:48:37,087 - INFO - +Epoch 84 +2024-05-22 02:48:37,515 - INFO - [0/363571] Loss : 1.1450 +2024-05-22 02:48:40,565 - INFO - Test loss : 0.5837 +2024-05-22 02:48:40,565 - INFO - +Epoch 85 +2024-05-22 02:48:40,993 - INFO - [0/363571] Loss : 1.1308 +2024-05-22 02:48:44,032 - INFO - Test loss : 0.5807 +2024-05-22 02:48:44,032 - INFO - +Epoch 86 +2024-05-22 02:48:44,541 - INFO - [0/363571] Loss : 1.1353 +2024-05-22 02:48:47,576 - INFO - Test loss : 0.5783 +2024-05-22 02:48:47,577 - INFO - +Epoch 87 +2024-05-22 02:48:48,005 - INFO - [0/363571] Loss : 1.1373 +2024-05-22 02:48:51,061 - INFO - Test loss : 0.5775 +2024-05-22 02:48:51,062 - INFO - +Epoch 88 +2024-05-22 02:48:51,574 - INFO - [0/363571] Loss : 1.1436 +2024-05-22 02:48:54,617 - INFO - Test loss : 0.5770 +2024-05-22 02:48:54,617 - INFO - +Epoch 89 +2024-05-22 02:48:55,046 - INFO - [0/363571] Loss : 1.1231 +2024-05-22 02:48:58,083 - INFO - Test loss : 0.5798 +2024-05-22 02:48:58,083 - INFO - +Epoch 90 +2024-05-22 02:48:58,596 - INFO - [0/363571] Loss : 1.1327 +2024-05-22 02:49:01,623 - INFO - Test loss : 0.5842 +2024-05-22 02:49:02,225 - INFO - (53041,) +2024-05-22 02:49:02,377 - INFO - Split ID: 0 +2024-05-22 02:49:02,396 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 19.61 +2024-05-22 02:49:02,398 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 33.55 +2024-05-22 02:49:02,399 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 42.54 +2024-05-22 02:49:02,401 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 56.78 +2024-05-22 02:49:02,407 - INFO - +No prior +2024-05-22 02:49:02,412 - INFO - (53041,) +2024-05-22 02:49:02,561 - INFO - Split ID: 0 +2024-05-22 02:49:02,561 - INFO - Top 1 (Epoch 90)acc (%): 69.83 +2024-05-22 02:49:02,561 - INFO - Top 3 (Epoch 90)acc (%): 84.61 +2024-05-22 02:49:02,561 - INFO - Top 5 (Epoch 90)acc (%): 89.23 +2024-05-22 02:49:02,561 - INFO - Top 10 (Epoch 90)acc (%): 94.29 +2024-05-22 02:49:24,409 - INFO - Split ID: 0 +2024-05-22 02:49:24,409 - INFO - Top 1 (Epoch 90)acc (%): 70.67 +2024-05-22 02:49:24,409 - INFO - Top 3 (Epoch 90)acc (%): 85.32 +2024-05-22 02:49:24,410 - INFO - Top 5 (Epoch 90)acc (%): 89.92 +2024-05-22 02:49:24,410 - INFO - Top 10 (Epoch 90)acc (%): 94.79 +2024-05-22 02:49:24,410 - INFO - +Epoch 91 +2024-05-22 02:49:24,829 - INFO - [0/363571] Loss : 1.1304 +2024-05-22 02:49:27,816 - INFO - Test loss : 0.5876 +2024-05-22 02:49:27,817 - INFO - +Epoch 92 +2024-05-22 02:49:28,246 - INFO - [0/363571] Loss : 1.1260 +2024-05-22 02:49:31,348 - INFO - Test loss : 0.5918 +2024-05-22 02:49:31,348 - INFO - +Epoch 93 +2024-05-22 02:49:31,772 - INFO - [0/363571] Loss : 1.1249 +2024-05-22 02:49:34,798 - INFO - Test loss : 0.5932 +2024-05-22 02:49:34,798 - INFO - +Epoch 94 +2024-05-22 02:49:35,221 - INFO - [0/363571] Loss : 1.1235 +2024-05-22 02:49:38,238 - INFO - Test loss : 0.5917 +2024-05-22 02:49:38,238 - INFO - +Epoch 95 +2024-05-22 02:49:38,744 - INFO - [0/363571] Loss : 1.1332 +2024-05-22 02:49:41,750 - INFO - Test loss : 0.5879 +2024-05-22 02:49:41,750 - INFO - +Epoch 96 +2024-05-22 02:49:42,177 - INFO - [0/363571] Loss : 1.1214 +2024-05-22 02:49:45,191 - INFO - Test loss : 0.5830 +2024-05-22 02:49:45,191 - INFO - +Epoch 97 +2024-05-22 02:49:45,702 - INFO - [0/363571] Loss : 1.1283 +2024-05-22 02:49:48,721 - INFO - Test loss : 0.5782 +2024-05-22 02:49:48,721 - INFO - +Epoch 98 +2024-05-22 02:49:49,144 - INFO - [0/363571] Loss : 1.1175 +2024-05-22 02:49:52,146 - INFO - Test loss 
: 0.5750 +2024-05-22 02:49:52,146 - INFO - +Epoch 99 +2024-05-22 02:49:52,570 - INFO - [0/363571] Loss : 1.1169 +2024-05-22 02:49:55,653 - INFO - Test loss : 0.5745 +2024-05-22 02:49:55,653 - INFO - +Epoch 100 +2024-05-22 02:49:56,078 - INFO - [0/363571] Loss : 1.1260 +2024-05-22 02:49:59,084 - INFO - Test loss : 0.5746 +2024-05-22 02:49:59,669 - INFO - (53041,) +2024-05-22 02:49:59,822 - INFO - Split ID: 0 +2024-05-22 02:49:59,844 - INFO - Top 1 LocEnc (Epoch 100)acc (%): 20.22 +2024-05-22 02:49:59,846 - INFO - Top 3 LocEnc (Epoch 100)acc (%): 34.38 +2024-05-22 02:49:59,847 - INFO - Top 5 LocEnc (Epoch 100)acc (%): 43.02 +2024-05-22 02:49:59,849 - INFO - Top 10 LocEnc (Epoch 100)acc (%): 57.37 +2024-05-22 02:49:59,855 - INFO - +No prior +2024-05-22 02:49:59,860 - INFO - (53041,) +2024-05-22 02:50:00,010 - INFO - Split ID: 0 +2024-05-22 02:50:00,011 - INFO - Top 1 (Epoch 100)acc (%): 69.83 +2024-05-22 02:50:00,011 - INFO - Top 3 (Epoch 100)acc (%): 84.61 +2024-05-22 02:50:00,011 - INFO - Top 5 (Epoch 100)acc (%): 89.23 +2024-05-22 02:50:00,011 - INFO - Top 10 (Epoch 100)acc (%): 94.29 +2024-05-22 02:50:21,903 - INFO - Split ID: 0 +2024-05-22 02:50:21,903 - INFO - Top 1 (Epoch 100)acc (%): 70.66 +2024-05-22 02:50:21,903 - INFO - Top 3 (Epoch 100)acc (%): 85.32 +2024-05-22 02:50:21,904 - INFO - Top 5 (Epoch 100)acc (%): 89.91 +2024-05-22 02:50:21,904 - INFO - Top 10 (Epoch 100)acc (%): 94.79 +2024-05-22 02:50:21,904 - INFO - +Epoch 101 +2024-05-22 02:50:22,328 - INFO - [0/363571] Loss : 1.1117 +2024-05-22 02:50:25,348 - INFO - Test loss : 0.5757 +2024-05-22 02:50:25,348 - INFO - +Epoch 102 +2024-05-22 02:50:25,854 - INFO - [0/363571] Loss : 1.1243 +2024-05-22 02:50:28,870 - INFO - Test loss : 0.5766 +2024-05-22 02:50:28,871 - INFO - +Epoch 103 +2024-05-22 02:50:29,296 - INFO - [0/363571] Loss : 1.1174 +2024-05-22 02:50:32,307 - INFO - Test loss : 0.5785 +2024-05-22 02:50:32,307 - INFO - +Epoch 104 +2024-05-22 02:50:32,815 - INFO - [0/363571] Loss : 1.1279 +2024-05-22 02:50:35,811 - INFO - Test loss : 0.5799 +2024-05-22 02:50:35,811 - INFO - +Epoch 105 +2024-05-22 02:50:36,236 - INFO - [0/363571] Loss : 1.1238 +2024-05-22 02:50:39,223 - INFO - Test loss : 0.5795 +2024-05-22 02:50:39,223 - INFO - +Epoch 106 +2024-05-22 02:50:39,735 - INFO - [0/363571] Loss : 1.1227 +2024-05-22 02:50:42,762 - INFO - Test loss : 0.5787 +2024-05-22 02:50:42,762 - INFO - +Epoch 107 +2024-05-22 02:50:43,188 - INFO - [0/363571] Loss : 1.1221 +2024-05-22 02:50:46,205 - INFO - Test loss : 0.5770 +2024-05-22 02:50:46,205 - INFO - +Epoch 108 +2024-05-22 02:50:46,630 - INFO - [0/363571] Loss : 1.1174 +2024-05-22 02:50:49,735 - INFO - Test loss : 0.5759 +2024-05-22 02:50:49,735 - INFO - +Epoch 109 +2024-05-22 02:50:50,160 - INFO - [0/363571] Loss : 1.1107 +2024-05-22 02:50:53,181 - INFO - Test loss : 0.5756 +2024-05-22 02:50:53,181 - INFO - +Epoch 110 +2024-05-22 02:50:53,605 - INFO - [0/363571] Loss : 1.1239 +2024-05-22 02:50:56,636 - INFO - Test loss : 0.5761 +2024-05-22 02:50:57,223 - INFO - (53041,) +2024-05-22 02:50:57,378 - INFO - Split ID: 0 +2024-05-22 02:50:57,400 - INFO - Top 1 LocEnc (Epoch 110)acc (%): 20.35 +2024-05-22 02:50:57,402 - INFO - Top 3 LocEnc (Epoch 110)acc (%): 34.74 +2024-05-22 02:50:57,404 - INFO - Top 5 LocEnc (Epoch 110)acc (%): 43.2 +2024-05-22 02:50:57,406 - INFO - Top 10 LocEnc (Epoch 110)acc (%): 57.33 +2024-05-22 02:50:57,412 - INFO - +No prior +2024-05-22 02:50:57,417 - INFO - (53041,) +2024-05-22 02:50:57,570 - INFO - Split ID: 0 +2024-05-22 02:50:57,571 - INFO - Top 1 (Epoch 110)acc 
(%): 69.83 +2024-05-22 02:50:57,571 - INFO - Top 3 (Epoch 110)acc (%): 84.61 +2024-05-22 02:50:57,571 - INFO - Top 5 (Epoch 110)acc (%): 89.23 +2024-05-22 02:50:57,571 - INFO - Top 10 (Epoch 110)acc (%): 94.29 +2024-05-22 02:51:19,505 - INFO - Split ID: 0 +2024-05-22 02:51:19,505 - INFO - Top 1 (Epoch 110)acc (%): 70.65 +2024-05-22 02:51:19,505 - INFO - Top 3 (Epoch 110)acc (%): 85.35 +2024-05-22 02:51:19,505 - INFO - Top 5 (Epoch 110)acc (%): 89.92 +2024-05-22 02:51:19,505 - INFO - Top 10 (Epoch 110)acc (%): 94.78 +2024-05-22 02:51:19,506 - INFO - +Epoch 111 +2024-05-22 02:51:20,013 - INFO - [0/363571] Loss : 1.1056 +2024-05-22 02:51:23,040 - INFO - Test loss : 0.5774 +2024-05-22 02:51:23,040 - INFO - +Epoch 112 +2024-05-22 02:51:23,466 - INFO - [0/363571] Loss : 1.1030 +2024-05-22 02:51:26,503 - INFO - Test loss : 0.5794 +2024-05-22 02:51:26,504 - INFO - +Epoch 113 +2024-05-22 02:51:27,018 - INFO - [0/363571] Loss : 1.1028 +2024-05-22 02:51:30,054 - INFO - Test loss : 0.5817 +2024-05-22 02:51:30,054 - INFO - +Epoch 114 +2024-05-22 02:51:30,481 - INFO - [0/363571] Loss : 1.1068 +2024-05-22 02:51:33,502 - INFO - Test loss : 0.5829 +2024-05-22 02:51:33,502 - INFO - +Epoch 115 +2024-05-22 02:51:33,929 - INFO - [0/363571] Loss : 1.1106 +2024-05-22 02:51:37,060 - INFO - Test loss : 0.5832 +2024-05-22 02:51:37,060 - INFO - +Epoch 116 +2024-05-22 02:51:37,487 - INFO - [0/363571] Loss : 1.1245 +2024-05-22 02:51:40,507 - INFO - Test loss : 0.5813 +2024-05-22 02:51:40,507 - INFO - +Epoch 117 +2024-05-22 02:51:40,936 - INFO - [0/363571] Loss : 1.1088 +2024-05-22 02:51:43,976 - INFO - Test loss : 0.5797 +2024-05-22 02:51:43,976 - INFO - +Epoch 118 +2024-05-22 02:51:44,485 - INFO - [0/363571] Loss : 1.1087 +2024-05-22 02:51:47,503 - INFO - Test loss : 0.5778 +2024-05-22 02:51:47,503 - INFO - +Epoch 119 +2024-05-22 02:51:47,929 - INFO - [0/363571] Loss : 1.1096 +2024-05-22 02:51:50,956 - INFO - Test loss : 0.5752 +2024-05-22 02:51:50,956 - INFO - Saving output model to ../models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-22 02:51:50,977 - INFO - Saving output model to ../models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-22 02:51:50,999 - INFO - +No prior +2024-05-22 02:51:51,004 - INFO - (53041,) +2024-05-22 02:51:51,160 - INFO - Split ID: 0 +2024-05-22 02:51:51,160 - INFO - Top 1 acc (%): 69.83 +2024-05-22 02:51:51,160 - INFO - Top 3 acc (%): 84.61 +2024-05-22 02:51:51,160 - INFO - Top 5 acc (%): 89.23 +2024-05-22 02:51:51,161 - INFO - Top 10 acc (%): 94.29 +2024-05-22 02:52:13,167 - INFO - Split ID: 0 +2024-05-22 02:52:13,167 - INFO - Top 1 acc (%): 70.67 +2024-05-22 02:52:13,168 - INFO - Top 3 acc (%): 85.34 +2024-05-22 02:52:13,168 - INFO - Top 5 acc (%): 89.92 +2024-05-22 02:52:13,168 - INFO - Top 10 acc (%): 94.79 +2024-05-22 02:52:13,170 - INFO - +Space2Vec-grid +2024-05-22 02:52:13,170 - INFO - Model : model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-22 02:52:13,791 - INFO - (53041,) +2024-05-22 02:52:13,950 - INFO - Split ID: 0 +2024-05-22 02:52:13,971 - INFO - Top 1 LocEnc acc (%): 20.32 +2024-05-22 02:52:13,973 - INFO - Top 3 LocEnc acc (%): 34.83 +2024-05-22 02:52:13,975 - INFO - Top 5 LocEnc acc (%): 43.27 +2024-05-22 02:52:13,977 - INFO - Top 10 LocEnc acc (%): 57.3 +2024-05-31 01:46:11,870 - INFO - +num_classes 62 +2024-05-31 01:46:11,870 - INFO - num train 363571 
+2024-05-31 01:46:11,870 - INFO - num val 53041 +2024-05-31 01:46:11,870 - INFO - train loss full_loss +2024-05-31 01:46:11,870 - INFO - model name ../models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:46:11,870 - INFO - num users 1 +2024-05-31 01:46:12,756 - INFO - +Only Space2Vec-grid +2024-05-31 01:46:12,756 - INFO - Model : model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:46:12,831 - INFO - Saving output model to ../models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:46:12,870 - INFO - Saving output model to ../models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:46:12,905 - INFO - +No prior +2024-05-31 01:46:12,915 - INFO - (53041,) +2024-05-31 01:46:13,855 - INFO - Save results to ../eval_results/eval_fmow__val_no_prior.csv +2024-05-31 01:46:13,855 - INFO - Split ID: 0 +2024-05-31 01:46:13,855 - INFO - Top 1 acc (%): 69.83 +2024-05-31 01:46:13,855 - INFO - Top 3 acc (%): 84.61 +2024-05-31 01:46:13,855 - INFO - Top 5 acc (%): 89.23 +2024-05-31 01:46:13,856 - INFO - Top 10 acc (%): 94.29 +2024-05-31 01:46:39,340 - INFO - Split ID: 0 +2024-05-31 01:46:39,340 - INFO - Top 1 hit (%): 70.67 +2024-05-31 01:46:39,340 - INFO - Top 3 hit (%): 85.34 +2024-05-31 01:46:39,340 - INFO - Top 5 hit (%): 89.92 +2024-05-31 01:46:39,340 - INFO - Top 10 hit (%): 94.79 +2024-05-31 01:46:39,354 - INFO - +Only Space2Vec-grid +2024-05-31 01:46:39,354 - INFO - Model : model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:46:39,979 - INFO - (53041,) +2024-05-31 01:46:40,128 - INFO - Split ID: 0 +2024-05-31 01:46:40,145 - INFO - Top 1 LocEnc acc (%): 20.32 +2024-05-31 01:46:40,147 - INFO - Top 3 LocEnc acc (%): 34.83 +2024-05-31 01:46:40,149 - INFO - Top 5 LocEnc acc (%): 43.27 +2024-05-31 01:46:40,150 - INFO - Top 10 LocEnc acc (%): 57.3 diff --git a/pre_trained_models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar b/pre_trained_models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar new file mode 100755 index 00000000..348730c0 Binary files /dev/null and b/pre_trained_models/space2vec_grid/model_fmow_Space2Vec-grid_inception_v3_0.0100_128_0.0200000_360.000_1_512_BATCH8192_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..dd9c0b94 --- /dev/null +++ b/pre_trained_models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.log @@ -0,0 +1,737 @@ +2024-05-25 16:12:43,044 - INFO - +num_classes 5089 +2024-05-25 16:12:43,045 - INFO - num train 569465 +2024-05-25 16:12:43,045 - INFO - num val 93622 +2024-05-25 16:12:43,045 - INFO - train loss full_loss +2024-05-25 16:12:43,045 - INFO - model name ../models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-25 
16:12:43,045 - INFO - num users 17302 +2024-05-25 16:12:43,933 - INFO - +Epoch 0 +2024-05-25 16:12:59,651 - INFO - [204800/569465] Loss : 1.4333 +2024-05-25 16:13:03,986 - INFO - [266240/569465] Loss : 1.3045 +2024-05-25 16:13:09,326 - INFO - Test loss : 0.3513 +2024-05-25 16:13:09,326 - INFO - +Epoch 1 +2024-05-25 16:13:24,592 - INFO - [204800/569465] Loss : 0.7542 +2024-05-25 16:13:28,934 - INFO - [266240/569465] Loss : 0.7438 +2024-05-25 16:13:34,277 - INFO - Test loss : 0.3558 +2024-05-25 16:13:34,277 - INFO - +Epoch 2 +2024-05-25 16:13:49,540 - INFO - [204800/569465] Loss : 0.6637 +2024-05-25 16:13:53,882 - INFO - [266240/569465] Loss : 0.6605 +2024-05-25 16:13:59,221 - INFO - Test loss : 0.3550 +2024-05-25 16:13:59,221 - INFO - +Epoch 3 +2024-05-25 16:14:14,491 - INFO - [204800/569465] Loss : 0.6158 +2024-05-25 16:14:18,849 - INFO - [266240/569465] Loss : 0.6139 +2024-05-25 16:14:24,209 - INFO - Test loss : 0.3873 +2024-05-25 16:14:24,209 - INFO - +Epoch 4 +2024-05-25 16:14:39,439 - INFO - [204800/569465] Loss : 0.5840 +2024-05-25 16:14:43,765 - INFO - [266240/569465] Loss : 0.5843 +2024-05-25 16:14:49,093 - INFO - Test loss : 0.3964 +2024-05-25 16:14:49,093 - INFO - +Epoch 5 +2024-05-25 16:15:04,320 - INFO - [204800/569465] Loss : 0.5630 +2024-05-25 16:15:08,596 - INFO - [266240/569465] Loss : 0.5629 +2024-05-25 16:15:13,997 - INFO - Test loss : 0.4452 +2024-05-25 16:15:13,998 - INFO - +Epoch 6 +2024-05-25 16:15:29,084 - INFO - [204800/569465] Loss : 0.5443 +2024-05-25 16:15:33,390 - INFO - [266240/569465] Loss : 0.5449 +2024-05-25 16:15:38,719 - INFO - Test loss : 0.4563 +2024-05-25 16:15:38,719 - INFO - +Epoch 7 +2024-05-25 16:15:53,840 - INFO - [204800/569465] Loss : 0.5303 +2024-05-25 16:15:58,175 - INFO - [266240/569465] Loss : 0.5314 +2024-05-25 16:16:03,522 - INFO - Test loss : 0.4822 +2024-05-25 16:16:03,522 - INFO - +Epoch 8 +2024-05-25 16:16:18,690 - INFO - [204800/569465] Loss : 0.5186 +2024-05-25 16:16:23,032 - INFO - [266240/569465] Loss : 0.5191 +2024-05-25 16:16:28,393 - INFO - Test loss : 0.5213 +2024-05-25 16:16:28,393 - INFO - +Epoch 9 +2024-05-25 16:16:43,643 - INFO - [204800/569465] Loss : 0.5093 +2024-05-25 16:16:47,998 - INFO - [266240/569465] Loss : 0.5101 +2024-05-25 16:16:53,319 - INFO - Test loss : 0.5480 +2024-05-25 16:16:53,319 - INFO - +Epoch 10 +2024-05-25 16:17:08,632 - INFO - [204800/569465] Loss : 0.5003 +2024-05-25 16:17:12,886 - INFO - [266240/569465] Loss : 0.5014 +2024-05-25 16:17:18,277 - INFO - Test loss : 0.5503 +2024-05-25 16:17:20,434 - INFO - (93622,) +2024-05-25 16:18:06,660 - INFO - Split ID: 0 +2024-05-25 16:18:06,698 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 0.86 +2024-05-25 16:18:06,701 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 2.42 +2024-05-25 16:18:06,704 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 3.77 +2024-05-25 16:18:06,707 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 6.69 +2024-05-25 16:18:06,960 - INFO - +No prior +2024-05-25 16:18:07,520 - INFO - (95986,) +2024-05-25 16:18:25,579 - INFO - Split ID: 0 +2024-05-25 16:18:25,580 - INFO - Top 1 (Epoch 10)acc (%): 63.27 +2024-05-25 16:18:25,580 - INFO - Top 3 (Epoch 10)acc (%): 79.82 +2024-05-25 16:18:25,580 - INFO - Top 5 (Epoch 10)acc (%): 84.51 +2024-05-25 16:18:25,580 - INFO - Top 10 (Epoch 10)acc (%): 88.99 +2024-05-25 16:19:10,155 - INFO - Split ID: 0 +2024-05-25 16:19:10,155 - INFO - Top 1 (Epoch 10)acc (%): 69.29 +2024-05-25 16:19:10,155 - INFO - Top 3 (Epoch 10)acc (%): 84.34 +2024-05-25 16:19:10,156 - INFO - Top 5 (Epoch 10)acc (%): 88.16 +2024-05-25 16:19:10,156 - INFO - Top 10 
(Epoch 10)acc (%): 91.72 +2024-05-25 16:19:10,158 - INFO - +Epoch 11 +2024-05-25 16:19:25,455 - INFO - [204800/569465] Loss : 0.4917 +2024-05-25 16:19:29,685 - INFO - [266240/569465] Loss : 0.4933 +2024-05-25 16:19:35,100 - INFO - Test loss : 0.5632 +2024-05-25 16:19:35,101 - INFO - +Epoch 12 +2024-05-25 16:19:50,430 - INFO - [204800/569465] Loss : 0.4875 +2024-05-25 16:19:54,693 - INFO - [266240/569465] Loss : 0.4880 +2024-05-25 16:20:00,108 - INFO - Test loss : 0.5941 +2024-05-25 16:20:00,108 - INFO - +Epoch 13 +2024-05-25 16:20:15,358 - INFO - [204800/569465] Loss : 0.4776 +2024-05-25 16:20:19,608 - INFO - [266240/569465] Loss : 0.4792 +2024-05-25 16:20:25,026 - INFO - Test loss : 0.5891 +2024-05-25 16:20:25,026 - INFO - +Epoch 14 +2024-05-25 16:20:40,251 - INFO - [204800/569465] Loss : 0.4740 +2024-05-25 16:20:44,488 - INFO - [266240/569465] Loss : 0.4752 +2024-05-25 16:20:49,808 - INFO - Test loss : 0.6469 +2024-05-25 16:20:49,809 - INFO - +Epoch 15 +2024-05-25 16:21:05,022 - INFO - [204800/569465] Loss : 0.4689 +2024-05-25 16:21:09,355 - INFO - [266240/569465] Loss : 0.4702 +2024-05-25 16:21:14,677 - INFO - Test loss : 0.6074 +2024-05-25 16:21:14,678 - INFO - +Epoch 16 +2024-05-25 16:21:29,814 - INFO - [204800/569465] Loss : 0.4642 +2024-05-25 16:21:34,124 - INFO - [266240/569465] Loss : 0.4651 +2024-05-25 16:21:39,474 - INFO - Test loss : 0.6643 +2024-05-25 16:21:39,474 - INFO - +Epoch 17 +2024-05-25 16:21:54,684 - INFO - [204800/569465] Loss : 0.4596 +2024-05-25 16:21:59,013 - INFO - [266240/569465] Loss : 0.4615 +2024-05-25 16:22:04,354 - INFO - Test loss : 0.6403 +2024-05-25 16:22:04,355 - INFO - +Epoch 18 +2024-05-25 16:22:19,558 - INFO - [204800/569465] Loss : 0.4550 +2024-05-25 16:22:23,887 - INFO - [266240/569465] Loss : 0.4566 +2024-05-25 16:22:29,187 - INFO - Test loss : 0.6456 +2024-05-25 16:22:29,188 - INFO - +Epoch 19 +2024-05-25 16:22:44,450 - INFO - [204800/569465] Loss : 0.4524 +2024-05-25 16:22:48,779 - INFO - [266240/569465] Loss : 0.4539 +2024-05-25 16:22:54,090 - INFO - Test loss : 0.6531 +2024-05-25 16:22:54,090 - INFO - +Epoch 20 +2024-05-25 16:23:09,268 - INFO - [204800/569465] Loss : 0.4486 +2024-05-25 16:23:13,575 - INFO - [266240/569465] Loss : 0.4505 +2024-05-25 16:23:18,885 - INFO - Test loss : 0.6661 +2024-05-25 16:23:21,035 - INFO - (93622,) +2024-05-25 16:24:07,152 - INFO - Split ID: 0 +2024-05-25 16:24:07,191 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 1.04 +2024-05-25 16:24:07,194 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 2.78 +2024-05-25 16:24:07,197 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 4.23 +2024-05-25 16:24:07,200 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 7.49 +2024-05-25 16:24:07,471 - INFO - +No prior +2024-05-25 16:24:08,052 - INFO - (95986,) +2024-05-25 16:24:26,055 - INFO - Split ID: 0 +2024-05-25 16:24:26,056 - INFO - Top 1 (Epoch 20)acc (%): 63.27 +2024-05-25 16:24:26,056 - INFO - Top 3 (Epoch 20)acc (%): 79.82 +2024-05-25 16:24:26,056 - INFO - Top 5 (Epoch 20)acc (%): 84.51 +2024-05-25 16:24:26,056 - INFO - Top 10 (Epoch 20)acc (%): 88.99 +2024-05-25 16:25:10,285 - INFO - Split ID: 0 +2024-05-25 16:25:10,285 - INFO - Top 1 (Epoch 20)acc (%): 69.07 +2024-05-25 16:25:10,285 - INFO - Top 3 (Epoch 20)acc (%): 84.2 +2024-05-25 16:25:10,285 - INFO - Top 5 (Epoch 20)acc (%): 87.97 +2024-05-25 16:25:10,285 - INFO - Top 10 (Epoch 20)acc (%): 91.55 +2024-05-25 16:25:10,288 - INFO - +Epoch 21 +2024-05-25 16:25:25,466 - INFO - [204800/569465] Loss : 0.4463 +2024-05-25 16:25:29,773 - INFO - [266240/569465] Loss : 0.4474 +2024-05-25 16:25:35,108 - INFO - 
Test loss : 0.6965 +2024-05-25 16:25:35,108 - INFO - +Epoch 22 +2024-05-25 16:25:50,330 - INFO - [204800/569465] Loss : 0.4424 +2024-05-25 16:25:54,561 - INFO - [266240/569465] Loss : 0.4444 +2024-05-25 16:25:59,952 - INFO - Test loss : 0.6765 +2024-05-25 16:25:59,952 - INFO - +Epoch 23 +2024-05-25 16:26:15,112 - INFO - [204800/569465] Loss : 0.4396 +2024-05-25 16:26:19,424 - INFO - [266240/569465] Loss : 0.4412 +2024-05-25 16:26:24,735 - INFO - Test loss : 0.6977 +2024-05-25 16:26:24,735 - INFO - +Epoch 24 +2024-05-25 16:26:39,865 - INFO - [204800/569465] Loss : 0.4379 +2024-05-25 16:26:44,170 - INFO - [266240/569465] Loss : 0.4390 +2024-05-25 16:26:49,486 - INFO - Test loss : 0.7342 +2024-05-25 16:26:49,486 - INFO - +Epoch 25 +2024-05-25 16:27:04,670 - INFO - [204800/569465] Loss : 0.4348 +2024-05-25 16:27:09,026 - INFO - [266240/569465] Loss : 0.4361 +2024-05-25 16:27:14,390 - INFO - Test loss : 0.7281 +2024-05-25 16:27:14,390 - INFO - +Epoch 26 +2024-05-25 16:27:29,588 - INFO - [204800/569465] Loss : 0.4318 +2024-05-25 16:27:33,908 - INFO - [266240/569465] Loss : 0.4331 +2024-05-25 16:27:39,214 - INFO - Test loss : 0.7110 +2024-05-25 16:27:39,214 - INFO - +Epoch 27 +2024-05-25 16:27:54,383 - INFO - [204800/569465] Loss : 0.4293 +2024-05-25 16:27:58,613 - INFO - [266240/569465] Loss : 0.4309 +2024-05-25 16:28:04,024 - INFO - Test loss : 0.7217 +2024-05-25 16:28:04,024 - INFO - +Epoch 28 +2024-05-25 16:28:19,208 - INFO - [204800/569465] Loss : 0.4272 +2024-05-25 16:28:23,483 - INFO - [266240/569465] Loss : 0.4281 +2024-05-25 16:28:28,953 - INFO - Test loss : 0.7781 +2024-05-25 16:28:28,953 - INFO - +Epoch 29 +2024-05-25 16:28:44,219 - INFO - [204800/569465] Loss : 0.4259 +2024-05-25 16:28:48,461 - INFO - [266240/569465] Loss : 0.4271 +2024-05-25 16:28:53,870 - INFO - Test loss : 0.7592 +2024-05-25 16:28:53,870 - INFO - +Epoch 30 +2024-05-25 16:29:09,107 - INFO - [204800/569465] Loss : 0.4241 +2024-05-25 16:29:13,386 - INFO - [266240/569465] Loss : 0.4248 +2024-05-25 16:29:18,833 - INFO - Test loss : 0.7517 +2024-05-25 16:29:20,942 - INFO - (93622,) +2024-05-25 16:30:08,159 - INFO - Split ID: 0 +2024-05-25 16:30:08,196 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 1.2 +2024-05-25 16:30:08,199 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 2.95 +2024-05-25 16:30:08,202 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 4.45 +2024-05-25 16:30:08,205 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 7.83 +2024-05-25 16:30:08,456 - INFO - +No prior +2024-05-25 16:30:09,018 - INFO - (95986,) +2024-05-25 16:30:27,091 - INFO - Split ID: 0 +2024-05-25 16:30:27,091 - INFO - Top 1 (Epoch 30)acc (%): 63.27 +2024-05-25 16:30:27,091 - INFO - Top 3 (Epoch 30)acc (%): 79.82 +2024-05-25 16:30:27,092 - INFO - Top 5 (Epoch 30)acc (%): 84.51 +2024-05-25 16:30:27,092 - INFO - Top 10 (Epoch 30)acc (%): 88.99 +2024-05-25 16:31:11,432 - INFO - Split ID: 0 +2024-05-25 16:31:11,433 - INFO - Top 1 (Epoch 30)acc (%): 69.05 +2024-05-25 16:31:11,433 - INFO - Top 3 (Epoch 30)acc (%): 84.04 +2024-05-25 16:31:11,433 - INFO - Top 5 (Epoch 30)acc (%): 87.79 +2024-05-25 16:31:11,433 - INFO - Top 10 (Epoch 30)acc (%): 91.43 +2024-05-25 16:31:11,435 - INFO - +Epoch 31 +2024-05-25 16:31:26,667 - INFO - [204800/569465] Loss : 0.4216 +2024-05-25 16:31:30,910 - INFO - [266240/569465] Loss : 0.4228 +2024-05-25 16:31:36,240 - INFO - Test loss : 0.7896 +2024-05-25 16:31:36,240 - INFO - +Epoch 32 +2024-05-25 16:31:51,424 - INFO - [204800/569465] Loss : 0.4202 +2024-05-25 16:31:55,764 - INFO - [266240/569465] Loss : 0.4214 +2024-05-25 16:32:01,097 - INFO - Test 
loss : 0.7940 +2024-05-25 16:32:01,097 - INFO - +Epoch 33 +2024-05-25 16:32:16,355 - INFO - [204800/569465] Loss : 0.4177 +2024-05-25 16:32:20,682 - INFO - [266240/569465] Loss : 0.4190 +2024-05-25 16:32:26,015 - INFO - Test loss : 0.7945 +2024-05-25 16:32:26,015 - INFO - +Epoch 34 +2024-05-25 16:32:41,201 - INFO - [204800/569465] Loss : 0.4180 +2024-05-25 16:32:45,527 - INFO - [266240/569465] Loss : 0.4185 +2024-05-25 16:32:50,836 - INFO - Test loss : 0.7884 +2024-05-25 16:32:50,836 - INFO - +Epoch 35 +2024-05-25 16:33:05,977 - INFO - [204800/569465] Loss : 0.4155 +2024-05-25 16:33:10,311 - INFO - [266240/569465] Loss : 0.4169 +2024-05-25 16:33:15,623 - INFO - Test loss : 0.7988 +2024-05-25 16:33:15,623 - INFO - +Epoch 36 +2024-05-25 16:33:30,814 - INFO - [204800/569465] Loss : 0.4128 +2024-05-25 16:33:35,124 - INFO - [266240/569465] Loss : 0.4144 +2024-05-25 16:33:40,433 - INFO - Test loss : 0.8086 +2024-05-25 16:33:40,433 - INFO - +Epoch 37 +2024-05-25 16:33:55,571 - INFO - [204800/569465] Loss : 0.4127 +2024-05-25 16:33:59,875 - INFO - [266240/569465] Loss : 0.4135 +2024-05-25 16:34:05,197 - INFO - Test loss : 0.8478 +2024-05-25 16:34:05,197 - INFO - +Epoch 38 +2024-05-25 16:34:20,340 - INFO - [204800/569465] Loss : 0.4093 +2024-05-25 16:34:24,681 - INFO - [266240/569465] Loss : 0.4101 +2024-05-25 16:34:30,011 - INFO - Test loss : 0.8180 +2024-05-25 16:34:30,011 - INFO - +Epoch 39 +2024-05-25 16:34:45,152 - INFO - [204800/569465] Loss : 0.4096 +2024-05-25 16:34:49,372 - INFO - [266240/569465] Loss : 0.4106 +2024-05-25 16:34:54,770 - INFO - Test loss : 0.8232 +2024-05-25 16:34:54,771 - INFO - +Epoch 40 +2024-05-25 16:35:10,191 - INFO - [204800/569465] Loss : 0.4072 +2024-05-25 16:35:14,508 - INFO - [266240/569465] Loss : 0.4087 +2024-05-25 16:35:19,858 - INFO - Test loss : 0.8612 +2024-05-25 16:35:21,965 - INFO - (93622,) +2024-05-25 16:36:08,112 - INFO - Split ID: 0 +2024-05-25 16:36:08,149 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 1.24 +2024-05-25 16:36:08,152 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 3.1 +2024-05-25 16:36:08,155 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 4.58 +2024-05-25 16:36:08,158 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 7.84 +2024-05-25 16:36:08,412 - INFO - +No prior +2024-05-25 16:36:08,970 - INFO - (95986,) +2024-05-25 16:36:26,980 - INFO - Split ID: 0 +2024-05-25 16:36:26,981 - INFO - Top 1 (Epoch 40)acc (%): 63.27 +2024-05-25 16:36:26,981 - INFO - Top 3 (Epoch 40)acc (%): 79.82 +2024-05-25 16:36:26,981 - INFO - Top 5 (Epoch 40)acc (%): 84.51 +2024-05-25 16:36:26,981 - INFO - Top 10 (Epoch 40)acc (%): 88.99 +2024-05-25 16:37:11,379 - INFO - Split ID: 0 +2024-05-25 16:37:11,380 - INFO - Top 1 (Epoch 40)acc (%): 68.87 +2024-05-25 16:37:11,380 - INFO - Top 3 (Epoch 40)acc (%): 83.83 +2024-05-25 16:37:11,380 - INFO - Top 5 (Epoch 40)acc (%): 87.6 +2024-05-25 16:37:11,380 - INFO - Top 10 (Epoch 40)acc (%): 91.24 +2024-05-25 16:37:11,382 - INFO - +Epoch 41 +2024-05-25 16:37:26,627 - INFO - [204800/569465] Loss : 0.4061 +2024-05-25 16:37:30,962 - INFO - [266240/569465] Loss : 0.4070 +2024-05-25 16:37:36,337 - INFO - Test loss : 0.8710 +2024-05-25 16:37:36,338 - INFO - +Epoch 42 +2024-05-25 16:37:51,598 - INFO - [204800/569465] Loss : 0.4046 +2024-05-25 16:37:55,956 - INFO - [266240/569465] Loss : 0.4054 +2024-05-25 16:38:01,297 - INFO - Test loss : 0.8638 +2024-05-25 16:38:01,297 - INFO - +Epoch 43 +2024-05-25 16:38:16,585 - INFO - [204800/569465] Loss : 0.4033 +2024-05-25 16:38:20,938 - INFO - [266240/569465] Loss : 0.4039 +2024-05-25 16:38:26,283 - INFO - Test loss : 
0.8640 +2024-05-25 16:38:26,283 - INFO - +Epoch 44 +2024-05-25 16:38:41,557 - INFO - [204800/569465] Loss : 0.4022 +2024-05-25 16:38:45,808 - INFO - [266240/569465] Loss : 0.4033 +2024-05-25 16:38:51,217 - INFO - Test loss : 0.8745 +2024-05-25 16:38:51,217 - INFO - +Epoch 45 +2024-05-25 16:39:06,475 - INFO - [204800/569465] Loss : 0.4004 +2024-05-25 16:39:10,715 - INFO - [266240/569465] Loss : 0.4010 +2024-05-25 16:39:16,125 - INFO - Test loss : 0.8881 +2024-05-25 16:39:16,125 - INFO - +Epoch 46 +2024-05-25 16:39:31,323 - INFO - [204800/569465] Loss : 0.3994 +2024-05-25 16:39:35,574 - INFO - [266240/569465] Loss : 0.3998 +2024-05-25 16:39:40,973 - INFO - Test loss : 0.9054 +2024-05-25 16:39:40,973 - INFO - +Epoch 47 +2024-05-25 16:39:56,233 - INFO - [204800/569465] Loss : 0.3987 +2024-05-25 16:40:00,446 - INFO - [266240/569465] Loss : 0.3996 +2024-05-25 16:40:05,849 - INFO - Test loss : 0.8815 +2024-05-25 16:40:05,849 - INFO - +Epoch 48 +2024-05-25 16:40:21,016 - INFO - [204800/569465] Loss : 0.3972 +2024-05-25 16:40:25,262 - INFO - [266240/569465] Loss : 0.3985 +2024-05-25 16:40:30,580 - INFO - Test loss : 0.8904 +2024-05-25 16:40:30,580 - INFO - +Epoch 49 +2024-05-25 16:40:45,809 - INFO - [204800/569465] Loss : 0.3968 +2024-05-25 16:40:50,216 - INFO - [266240/569465] Loss : 0.3974 +2024-05-25 16:40:55,597 - INFO - Test loss : 0.9020 +2024-05-25 16:40:55,597 - INFO - +Epoch 50 +2024-05-25 16:41:10,823 - INFO - [204800/569465] Loss : 0.3956 +2024-05-25 16:41:15,148 - INFO - [266240/569465] Loss : 0.3964 +2024-05-25 16:41:20,462 - INFO - Test loss : 0.9015 +2024-05-25 16:41:22,566 - INFO - (93622,) +2024-05-25 16:42:12,841 - INFO - Split ID: 0 +2024-05-25 16:42:12,878 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 1.2 +2024-05-25 16:42:12,881 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 3.14 +2024-05-25 16:42:12,884 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 4.73 +2024-05-25 16:42:12,887 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 8.14 +2024-05-25 16:42:13,141 - INFO - +No prior +2024-05-25 16:42:13,705 - INFO - (95986,) +2024-05-25 16:42:31,673 - INFO - Split ID: 0 +2024-05-25 16:42:31,674 - INFO - Top 1 (Epoch 50)acc (%): 63.27 +2024-05-25 16:42:31,674 - INFO - Top 3 (Epoch 50)acc (%): 79.82 +2024-05-25 16:42:31,674 - INFO - Top 5 (Epoch 50)acc (%): 84.51 +2024-05-25 16:42:31,674 - INFO - Top 10 (Epoch 50)acc (%): 88.99 +2024-05-25 16:43:15,955 - INFO - Split ID: 0 +2024-05-25 16:43:15,955 - INFO - Top 1 (Epoch 50)acc (%): 68.74 +2024-05-25 16:43:15,956 - INFO - Top 3 (Epoch 50)acc (%): 83.72 +2024-05-25 16:43:15,956 - INFO - Top 5 (Epoch 50)acc (%): 87.48 +2024-05-25 16:43:15,956 - INFO - Top 10 (Epoch 50)acc (%): 91.21 +2024-05-25 16:43:15,958 - INFO - +Epoch 51 +2024-05-25 16:43:31,083 - INFO - [204800/569465] Loss : 0.3930 +2024-05-25 16:43:35,413 - INFO - [266240/569465] Loss : 0.3946 +2024-05-25 16:43:40,741 - INFO - Test loss : 0.8927 +2024-05-25 16:43:40,742 - INFO - +Epoch 52 +2024-05-25 16:43:55,966 - INFO - [204800/569465] Loss : 0.3936 +2024-05-25 16:44:00,295 - INFO - [266240/569465] Loss : 0.3948 +2024-05-25 16:44:05,605 - INFO - Test loss : 0.8957 +2024-05-25 16:44:05,605 - INFO - +Epoch 53 +2024-05-25 16:44:20,837 - INFO - [204800/569465] Loss : 0.3933 +2024-05-25 16:44:25,201 - INFO - [266240/569465] Loss : 0.3938 +2024-05-25 16:44:30,539 - INFO - Test loss : 0.9141 +2024-05-25 16:44:30,540 - INFO - +Epoch 54 +2024-05-25 16:44:45,718 - INFO - [204800/569465] Loss : 0.3910 +2024-05-25 16:44:50,036 - INFO - [266240/569465] Loss : 0.3920 +2024-05-25 16:44:55,357 - INFO - Test loss : 
0.9263 +2024-05-25 16:44:55,357 - INFO - +Epoch 55 +2024-05-25 16:45:10,520 - INFO - [204800/569465] Loss : 0.3899 +2024-05-25 16:45:14,840 - INFO - [266240/569465] Loss : 0.3908 +2024-05-25 16:45:20,136 - INFO - Test loss : 0.9317 +2024-05-25 16:45:20,137 - INFO - +Epoch 56 +2024-05-25 16:45:35,381 - INFO - [204800/569465] Loss : 0.3894 +2024-05-25 16:45:39,619 - INFO - [266240/569465] Loss : 0.3904 +2024-05-25 16:45:45,023 - INFO - Test loss : 0.9304 +2024-05-25 16:45:45,023 - INFO - +Epoch 57 +2024-05-25 16:46:00,307 - INFO - [204800/569465] Loss : 0.3883 +2024-05-25 16:46:04,621 - INFO - [266240/569465] Loss : 0.3891 +2024-05-25 16:46:09,980 - INFO - Test loss : 0.9293 +2024-05-25 16:46:09,980 - INFO - +Epoch 58 +2024-05-25 16:46:25,238 - INFO - [204800/569465] Loss : 0.3871 +2024-05-25 16:46:29,573 - INFO - [266240/569465] Loss : 0.3882 +2024-05-25 16:46:34,920 - INFO - Test loss : 0.9334 +2024-05-25 16:46:34,920 - INFO - +Epoch 59 +2024-05-25 16:46:50,172 - INFO - [204800/569465] Loss : 0.3872 +2024-05-25 16:46:54,522 - INFO - [266240/569465] Loss : 0.3882 +2024-05-25 16:46:59,857 - INFO - Test loss : 0.9536 +2024-05-25 16:46:59,857 - INFO - +Epoch 60 +2024-05-25 16:47:15,051 - INFO - [204800/569465] Loss : 0.3853 +2024-05-25 16:47:19,397 - INFO - [266240/569465] Loss : 0.3861 +2024-05-25 16:47:24,756 - INFO - Test loss : 0.9499 +2024-05-25 16:47:26,907 - INFO - (93622,) +2024-05-25 16:48:19,483 - INFO - Split ID: 0 +2024-05-25 16:48:19,520 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 1.31 +2024-05-25 16:48:19,523 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 3.22 +2024-05-25 16:48:19,526 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 4.79 +2024-05-25 16:48:19,529 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 8.3 +2024-05-25 16:48:19,774 - INFO - +No prior +2024-05-25 16:48:20,335 - INFO - (95986,) +2024-05-25 16:48:38,345 - INFO - Split ID: 0 +2024-05-25 16:48:38,345 - INFO - Top 1 (Epoch 60)acc (%): 63.27 +2024-05-25 16:48:38,345 - INFO - Top 3 (Epoch 60)acc (%): 79.82 +2024-05-25 16:48:38,346 - INFO - Top 5 (Epoch 60)acc (%): 84.51 +2024-05-25 16:48:38,346 - INFO - Top 10 (Epoch 60)acc (%): 88.99 +2024-05-25 16:49:22,554 - INFO - Split ID: 0 +2024-05-25 16:49:22,554 - INFO - Top 1 (Epoch 60)acc (%): 68.6 +2024-05-25 16:49:22,554 - INFO - Top 3 (Epoch 60)acc (%): 83.61 +2024-05-25 16:49:22,554 - INFO - Top 5 (Epoch 60)acc (%): 87.38 +2024-05-25 16:49:22,554 - INFO - Top 10 (Epoch 60)acc (%): 91.08 +2024-05-25 16:49:22,557 - INFO - +Epoch 61 +2024-05-25 16:49:37,788 - INFO - [204800/569465] Loss : 0.3853 +2024-05-25 16:49:42,024 - INFO - [266240/569465] Loss : 0.3864 +2024-05-25 16:49:47,407 - INFO - Test loss : 0.9453 +2024-05-25 16:49:47,407 - INFO - +Epoch 62 +2024-05-25 16:50:02,546 - INFO - [204800/569465] Loss : 0.3838 +2024-05-25 16:50:06,773 - INFO - [266240/569465] Loss : 0.3849 +2024-05-25 16:50:12,161 - INFO - Test loss : 0.9508 +2024-05-25 16:50:12,161 - INFO - +Epoch 63 +2024-05-25 16:50:27,296 - INFO - [204800/569465] Loss : 0.3846 +2024-05-25 16:50:31,522 - INFO - [266240/569465] Loss : 0.3849 +2024-05-25 16:50:36,900 - INFO - Test loss : 0.9704 +2024-05-25 16:50:36,900 - INFO - +Epoch 64 +2024-05-25 16:50:52,071 - INFO - [204800/569465] Loss : 0.3820 +2024-05-25 16:50:56,293 - INFO - [266240/569465] Loss : 0.3831 +2024-05-25 16:51:01,683 - INFO - Test loss : 0.9583 +2024-05-25 16:51:01,683 - INFO - +Epoch 65 +2024-05-25 16:51:16,865 - INFO - [204800/569465] Loss : 0.3827 +2024-05-25 16:51:21,139 - INFO - [266240/569465] Loss : 0.3831 +2024-05-25 16:51:26,548 - INFO - Test loss : 0.9537 
+2024-05-25 16:51:26,548 - INFO - +Epoch 66 +2024-05-25 16:51:41,879 - INFO - [204800/569465] Loss : 0.3808 +2024-05-25 16:51:46,262 - INFO - [266240/569465] Loss : 0.3818 +2024-05-25 16:51:51,583 - INFO - Test loss : 0.9744 +2024-05-25 16:51:51,583 - INFO - +Epoch 67 +2024-05-25 16:52:06,808 - INFO - [204800/569465] Loss : 0.3806 +2024-05-25 16:52:11,171 - INFO - [266240/569465] Loss : 0.3811 +2024-05-25 16:52:16,495 - INFO - Test loss : 0.9849 +2024-05-25 16:52:16,495 - INFO - +Epoch 68 +2024-05-25 16:52:31,761 - INFO - [204800/569465] Loss : 0.3802 +2024-05-25 16:52:36,121 - INFO - [266240/569465] Loss : 0.3804 +2024-05-25 16:52:41,469 - INFO - Test loss : 0.9935 +2024-05-25 16:52:41,469 - INFO - +Epoch 69 +2024-05-25 16:52:56,664 - INFO - [204800/569465] Loss : 0.3792 +2024-05-25 16:53:01,014 - INFO - [266240/569465] Loss : 0.3802 +2024-05-25 16:53:06,373 - INFO - Test loss : 1.0081 +2024-05-25 16:53:06,373 - INFO - +Epoch 70 +2024-05-25 16:53:21,567 - INFO - [204800/569465] Loss : 0.3789 +2024-05-25 16:53:25,920 - INFO - [266240/569465] Loss : 0.3794 +2024-05-25 16:53:31,241 - INFO - Test loss : 0.9940 +2024-05-25 16:53:33,407 - INFO - (93622,) +2024-05-25 16:54:25,680 - INFO - Split ID: 0 +2024-05-25 16:54:25,716 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 1.29 +2024-05-25 16:54:25,719 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 3.2 +2024-05-25 16:54:25,722 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 4.79 +2024-05-25 16:54:25,725 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 8.22 +2024-05-25 16:54:25,968 - INFO - +No prior +2024-05-25 16:54:26,532 - INFO - (95986,) +2024-05-25 16:54:44,568 - INFO - Split ID: 0 +2024-05-25 16:54:44,568 - INFO - Top 1 (Epoch 70)acc (%): 63.27 +2024-05-25 16:54:44,569 - INFO - Top 3 (Epoch 70)acc (%): 79.82 +2024-05-25 16:54:44,569 - INFO - Top 5 (Epoch 70)acc (%): 84.51 +2024-05-25 16:54:44,569 - INFO - Top 10 (Epoch 70)acc (%): 88.99 +2024-05-25 16:55:28,864 - INFO - Split ID: 0 +2024-05-25 16:55:28,865 - INFO - Top 1 (Epoch 70)acc (%): 68.48 +2024-05-25 16:55:28,865 - INFO - Top 3 (Epoch 70)acc (%): 83.44 +2024-05-25 16:55:28,865 - INFO - Top 5 (Epoch 70)acc (%): 87.28 +2024-05-25 16:55:28,865 - INFO - Top 10 (Epoch 70)acc (%): 90.96 +2024-05-25 16:55:28,867 - INFO - +Epoch 71 +2024-05-25 16:55:48,249 - INFO - [204800/569465] Loss : 0.3779 +2024-05-25 16:55:53,070 - INFO - [266240/569465] Loss : 0.3776 +2024-05-25 16:55:58,398 - INFO - Test loss : 1.0182 +2024-05-25 16:55:58,398 - INFO - +Epoch 72 +2024-05-25 16:56:13,558 - INFO - [204800/569465] Loss : 0.3765 +2024-05-25 16:56:17,906 - INFO - [266240/569465] Loss : 0.3768 +2024-05-25 16:56:23,221 - INFO - Test loss : 1.0317 +2024-05-25 16:56:23,222 - INFO - +Epoch 73 +2024-05-25 16:56:38,488 - INFO - [204800/569465] Loss : 0.3758 +2024-05-25 16:56:42,749 - INFO - [266240/569465] Loss : 0.3768 +2024-05-25 16:56:48,149 - INFO - Test loss : 1.0022 +2024-05-25 16:56:48,149 - INFO - +Epoch 74 +2024-05-25 16:57:03,432 - INFO - [204800/569465] Loss : 0.3748 +2024-05-25 16:57:07,793 - INFO - [266240/569465] Loss : 0.3757 +2024-05-25 16:57:13,138 - INFO - Test loss : 1.0034 +2024-05-25 16:57:13,138 - INFO - +Epoch 75 +2024-05-25 16:57:28,418 - INFO - [204800/569465] Loss : 0.3738 +2024-05-25 16:57:32,751 - INFO - [266240/569465] Loss : 0.3746 +2024-05-25 16:57:38,147 - INFO - Test loss : 1.0083 +2024-05-25 16:57:38,147 - INFO - +Epoch 76 +2024-05-25 16:57:53,402 - INFO - [204800/569465] Loss : 0.3742 +2024-05-25 16:57:57,758 - INFO - [266240/569465] Loss : 0.3746 +2024-05-25 16:58:03,069 - INFO - Test loss : 1.0213 
+2024-05-25 16:58:03,069 - INFO - +Epoch 77 +2024-05-25 16:58:18,315 - INFO - [204800/569465] Loss : 0.3738 +2024-05-25 16:58:22,719 - INFO - [266240/569465] Loss : 0.3742 +2024-05-25 16:58:28,092 - INFO - Test loss : 1.0292 +2024-05-25 16:58:28,092 - INFO - +Epoch 78 +2024-05-25 16:58:43,353 - INFO - [204800/569465] Loss : 0.3723 +2024-05-25 16:58:47,609 - INFO - [266240/569465] Loss : 0.3730 +2024-05-25 16:58:53,028 - INFO - Test loss : 1.0404 +2024-05-25 16:58:53,028 - INFO - +Epoch 79 +2024-05-25 16:59:08,260 - INFO - [204800/569465] Loss : 0.3718 +2024-05-25 16:59:12,540 - INFO - [266240/569465] Loss : 0.3731 +2024-05-25 16:59:17,982 - INFO - Test loss : 1.0334 +2024-05-25 16:59:17,982 - INFO - +Epoch 80 +2024-05-25 16:59:33,186 - INFO - [204800/569465] Loss : 0.3712 +2024-05-25 16:59:37,427 - INFO - [266240/569465] Loss : 0.3721 +2024-05-25 16:59:42,848 - INFO - Test loss : 1.0035 +2024-05-25 16:59:45,032 - INFO - (93622,) +2024-05-25 17:00:30,994 - INFO - Split ID: 0 +2024-05-25 17:00:31,032 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 1.35 +2024-05-25 17:00:31,035 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 3.32 +2024-05-25 17:00:31,038 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 4.92 +2024-05-25 17:00:31,041 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 8.49 +2024-05-25 17:00:31,296 - INFO - +No prior +2024-05-25 17:00:31,854 - INFO - (95986,) +2024-05-25 17:00:49,818 - INFO - Split ID: 0 +2024-05-25 17:00:49,818 - INFO - Top 1 (Epoch 80)acc (%): 63.27 +2024-05-25 17:00:49,818 - INFO - Top 3 (Epoch 80)acc (%): 79.82 +2024-05-25 17:00:49,818 - INFO - Top 5 (Epoch 80)acc (%): 84.51 +2024-05-25 17:00:49,819 - INFO - Top 10 (Epoch 80)acc (%): 88.99 +2024-05-25 17:01:34,213 - INFO - Split ID: 0 +2024-05-25 17:01:34,213 - INFO - Top 1 (Epoch 80)acc (%): 68.45 +2024-05-25 17:01:34,213 - INFO - Top 3 (Epoch 80)acc (%): 83.37 +2024-05-25 17:01:34,214 - INFO - Top 5 (Epoch 80)acc (%): 87.2 +2024-05-25 17:01:34,214 - INFO - Top 10 (Epoch 80)acc (%): 90.94 +2024-05-25 17:01:34,216 - INFO - +Epoch 81 +2024-05-25 17:01:49,443 - INFO - [204800/569465] Loss : 0.3721 +2024-05-25 17:01:53,693 - INFO - [266240/569465] Loss : 0.3725 +2024-05-25 17:01:59,101 - INFO - Test loss : 1.0373 +2024-05-25 17:01:59,102 - INFO - +Epoch 82 +2024-05-25 17:02:14,348 - INFO - [204800/569465] Loss : 0.3717 +2024-05-25 17:02:18,596 - INFO - [266240/569465] Loss : 0.3718 +2024-05-25 17:02:23,929 - INFO - Test loss : 1.0703 +2024-05-25 17:02:23,929 - INFO - +Epoch 83 +2024-05-25 17:02:39,214 - INFO - [204800/569465] Loss : 0.3705 +2024-05-25 17:02:43,560 - INFO - [266240/569465] Loss : 0.3709 +2024-05-25 17:02:48,868 - INFO - Test loss : 1.0494 +2024-05-25 17:02:48,869 - INFO - +Epoch 84 +2024-05-25 17:03:04,087 - INFO - [204800/569465] Loss : 0.3692 +2024-05-25 17:03:08,477 - INFO - [266240/569465] Loss : 0.3700 +2024-05-25 17:03:13,820 - INFO - Test loss : 1.0626 +2024-05-25 17:03:13,821 - INFO - +Epoch 85 +2024-05-25 17:03:29,203 - INFO - [204800/569465] Loss : 0.3696 +2024-05-25 17:03:33,542 - INFO - [266240/569465] Loss : 0.3697 +2024-05-25 17:03:38,907 - INFO - Test loss : 1.0632 +2024-05-25 17:03:38,907 - INFO - +Epoch 86 +2024-05-25 17:03:54,182 - INFO - [204800/569465] Loss : 0.3685 +2024-05-25 17:03:58,536 - INFO - [266240/569465] Loss : 0.3684 +2024-05-25 17:04:03,875 - INFO - Test loss : 1.0719 +2024-05-25 17:04:03,875 - INFO - +Epoch 87 +2024-05-25 17:04:19,239 - INFO - [204800/569465] Loss : 0.3678 +2024-05-25 17:04:23,574 - INFO - [266240/569465] Loss : 0.3685 +2024-05-25 17:04:28,911 - INFO - Test loss : 1.0614 
+2024-05-25 17:04:28,911 - INFO - +Epoch 88 +2024-05-25 17:04:44,182 - INFO - [204800/569465] Loss : 0.3670 +2024-05-25 17:04:48,515 - INFO - [266240/569465] Loss : 0.3677 +2024-05-25 17:04:53,849 - INFO - Test loss : 1.0617 +2024-05-25 17:04:53,849 - INFO - +Epoch 89 +2024-05-25 17:05:09,102 - INFO - [204800/569465] Loss : 0.3671 +2024-05-25 17:05:13,454 - INFO - [266240/569465] Loss : 0.3673 +2024-05-25 17:05:18,798 - INFO - Test loss : 1.0785 +2024-05-25 17:05:18,798 - INFO - +Epoch 90 +2024-05-25 17:05:34,026 - INFO - [204800/569465] Loss : 0.3669 +2024-05-25 17:05:38,259 - INFO - [266240/569465] Loss : 0.3670 +2024-05-25 17:05:43,668 - INFO - Test loss : 1.0637 +2024-05-25 17:05:45,802 - INFO - (93622,) +2024-05-25 17:06:34,208 - INFO - Split ID: 0 +2024-05-25 17:06:34,245 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 1.31 +2024-05-25 17:06:34,248 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 3.39 +2024-05-25 17:06:34,251 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 5.06 +2024-05-25 17:06:34,254 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 8.48 +2024-05-25 17:06:34,496 - INFO - +No prior +2024-05-25 17:06:35,055 - INFO - (95986,) +2024-05-25 17:06:53,139 - INFO - Split ID: 0 +2024-05-25 17:06:53,140 - INFO - Top 1 (Epoch 90)acc (%): 63.27 +2024-05-25 17:06:53,140 - INFO - Top 3 (Epoch 90)acc (%): 79.82 +2024-05-25 17:06:53,140 - INFO - Top 5 (Epoch 90)acc (%): 84.51 +2024-05-25 17:06:53,140 - INFO - Top 10 (Epoch 90)acc (%): 88.99 +2024-05-25 17:07:37,531 - INFO - Split ID: 0 +2024-05-25 17:07:37,531 - INFO - Top 1 (Epoch 90)acc (%): 68.3 +2024-05-25 17:07:37,531 - INFO - Top 3 (Epoch 90)acc (%): 83.28 +2024-05-25 17:07:37,531 - INFO - Top 5 (Epoch 90)acc (%): 87.08 +2024-05-25 17:07:37,531 - INFO - Top 10 (Epoch 90)acc (%): 90.81 +2024-05-25 17:07:37,534 - INFO - +Epoch 91 +2024-05-25 17:07:52,768 - INFO - [204800/569465] Loss : 0.3661 +2024-05-25 17:07:57,125 - INFO - [266240/569465] Loss : 0.3669 +2024-05-25 17:08:02,464 - INFO - Test loss : 1.0762 +2024-05-25 17:08:02,465 - INFO - +Epoch 92 +2024-05-25 17:08:17,710 - INFO - [204800/569465] Loss : 0.3655 +2024-05-25 17:08:22,038 - INFO - [266240/569465] Loss : 0.3661 +2024-05-25 17:08:27,386 - INFO - Test loss : 1.0827 +2024-05-25 17:08:27,386 - INFO - +Epoch 93 +2024-05-25 17:08:42,617 - INFO - [204800/569465] Loss : 0.3648 +2024-05-25 17:08:46,949 - INFO - [266240/569465] Loss : 0.3657 +2024-05-25 17:08:52,267 - INFO - Test loss : 1.0871 +2024-05-25 17:08:52,267 - INFO - +Epoch 94 +2024-05-25 17:09:07,459 - INFO - [204800/569465] Loss : 0.3646 +2024-05-25 17:09:11,781 - INFO - [266240/569465] Loss : 0.3645 +2024-05-25 17:09:17,107 - INFO - Test loss : 1.1149 +2024-05-25 17:09:17,107 - INFO - +Epoch 95 +2024-05-25 17:09:32,331 - INFO - [204800/569465] Loss : 0.3634 +2024-05-25 17:09:36,559 - INFO - [266240/569465] Loss : 0.3643 +2024-05-25 17:09:41,962 - INFO - Test loss : 1.0906 +2024-05-25 17:09:41,962 - INFO - +Epoch 96 +2024-05-25 17:09:57,130 - INFO - [204800/569465] Loss : 0.3641 +2024-05-25 17:10:01,365 - INFO - [266240/569465] Loss : 0.3646 +2024-05-25 17:10:06,826 - INFO - Test loss : 1.0829 +2024-05-25 17:10:06,826 - INFO - +Epoch 97 +2024-05-25 17:10:22,087 - INFO - [204800/569465] Loss : 0.3626 +2024-05-25 17:10:26,332 - INFO - [266240/569465] Loss : 0.3631 +2024-05-25 17:10:31,729 - INFO - Test loss : 1.0940 +2024-05-25 17:10:31,729 - INFO - +Epoch 98 +2024-05-25 17:10:46,898 - INFO - [204800/569465] Loss : 0.3626 +2024-05-25 17:10:51,110 - INFO - [266240/569465] Loss : 0.3636 +2024-05-25 17:10:56,500 - INFO - Test loss : 1.0863 
+2024-05-25 17:10:56,500 - INFO - +Epoch 99 +2024-05-25 17:11:11,744 - INFO - [204800/569465] Loss : 0.3628 +2024-05-25 17:11:15,998 - INFO - [266240/569465] Loss : 0.3630 +2024-05-25 17:11:21,344 - INFO - Test loss : 1.1084 +2024-05-25 17:11:21,344 - INFO - Saving output model to ../models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-25 17:11:21,393 - INFO - Saving output model to ../models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-25 17:11:21,564 - INFO - +No prior +2024-05-25 17:11:22,167 - INFO - (95986,) +2024-05-25 17:11:42,391 - INFO - Split ID: 0 +2024-05-25 17:11:42,392 - INFO - Top 1 acc (%): 63.27 +2024-05-25 17:11:42,392 - INFO - Top 3 acc (%): 79.82 +2024-05-25 17:11:42,392 - INFO - Top 5 acc (%): 84.51 +2024-05-25 17:11:42,392 - INFO - Top 10 acc (%): 88.99 +2024-05-25 17:12:26,901 - INFO - Split ID: 0 +2024-05-25 17:12:26,902 - INFO - Top 1 acc (%): 68.23 +2024-05-25 17:12:26,902 - INFO - Top 3 acc (%): 83.11 +2024-05-25 17:12:26,902 - INFO - Top 5 acc (%): 86.97 +2024-05-25 17:12:26,902 - INFO - Top 10 acc (%): 90.71 +2024-05-25 17:12:26,927 - INFO - +Space2Vec-grid +2024-05-25 17:12:26,927 - INFO - Model : model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-25 17:12:29,113 - INFO - (93622,) +2024-05-25 17:13:15,094 - INFO - Split ID: 0 +2024-05-25 17:13:15,131 - INFO - Top 1 LocEnc acc (%): 1.35 +2024-05-25 17:13:15,134 - INFO - Top 3 LocEnc acc (%): 3.39 +2024-05-25 17:13:15,137 - INFO - Top 5 LocEnc acc (%): 5.07 +2024-05-25 17:13:15,140 - INFO - Top 10 LocEnc acc (%): 8.59 +2024-05-31 01:42:28,979 - INFO - +num_classes 5089 +2024-05-31 01:42:28,979 - INFO - num train 569465 +2024-05-31 01:42:28,979 - INFO - num val 93622 +2024-05-31 01:42:28,979 - INFO - train loss full_loss +2024-05-31 01:42:28,979 - INFO - model name ../models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:42:28,979 - INFO - num users 17302 +2024-05-31 01:42:29,843 - INFO - +Only Space2Vec-grid +2024-05-31 01:42:29,843 - INFO - Model : model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:42:30,189 - INFO - Saving output model to ../models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:42:30,458 - INFO - Saving output model to ../models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:42:30,656 - INFO - +No prior +2024-05-31 01:42:31,299 - INFO - (95986,) +2024-05-31 01:43:02,766 - INFO - Save results to ../eval_results/eval_inat_2017__val_no_prior.csv +2024-05-31 01:43:02,767 - INFO - Split ID: 0 +2024-05-31 01:43:02,767 - INFO - Top 1 acc (%): 63.27 +2024-05-31 01:43:02,767 - INFO - Top 3 acc (%): 79.82 +2024-05-31 01:43:02,768 - INFO - Top 5 acc (%): 84.51 +2024-05-31 01:43:02,768 - INFO - Top 10 acc (%): 88.99 +2024-05-31 01:44:24,765 - INFO - Split ID: 0 +2024-05-31 01:44:24,766 - INFO - Top 1 hit (%): 68.23 +2024-05-31 01:44:24,766 - INFO - Top 3 hit (%): 83.11 +2024-05-31 01:44:24,766 - INFO - Top 5 hit (%): 86.97 +2024-05-31 01:44:24,766 - INFO - Top 10 hit (%): 90.71 +2024-05-31 01:44:24,793 - INFO - +Only Space2Vec-grid 
+2024-05-31 01:44:24,793 - INFO - Model : model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:44:27,134 - INFO - (93622,) +2024-05-31 01:45:19,423 - INFO - Split ID: 0 +2024-05-31 01:45:19,462 - INFO - Top 1 LocEnc acc (%): 1.35 +2024-05-31 01:45:19,465 - INFO - Top 3 LocEnc acc (%): 3.39 +2024-05-31 01:45:19,468 - INFO - Top 5 LocEnc acc (%): 5.07 +2024-05-31 01:45:19,471 - INFO - Top 10 LocEnc acc (%): 8.59 diff --git a/pre_trained_models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..b65a26e3 Binary files /dev/null and b/pre_trained_models/space2vec_grid/model_inat_2017_Space2Vec-grid_inception_v3_0.0200_128_0.0050000_360.000_1_256_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..3fd340ad --- /dev/null +++ b/pre_trained_models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.log @@ -0,0 +1,737 @@ +2024-05-23 05:32:38,036 - INFO - +num_classes 8142 +2024-05-23 05:32:38,036 - INFO - num train 436063 +2024-05-23 05:32:38,036 - INFO - num val 24343 +2024-05-23 05:32:38,036 - INFO - train loss full_loss +2024-05-23 05:32:38,036 - INFO - model name ../models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-23 05:32:38,037 - INFO - num users 18643 +2024-05-23 05:32:38,976 - INFO - +Epoch 0 +2024-05-23 05:33:09,424 - INFO - [204800/436063] Loss : 1.1212 +2024-05-23 05:33:17,312 - INFO - [262144/436063] Loss : 1.0361 +2024-05-23 05:33:20,089 - INFO - Test loss : 0.2476 +2024-05-23 05:33:20,090 - INFO - +Epoch 1 +2024-05-23 05:33:48,853 - INFO - [204800/436063] Loss : 0.5766 +2024-05-23 05:33:56,042 - INFO - [262144/436063] Loss : 0.5677 +2024-05-23 05:33:59,213 - INFO - Test loss : 0.2012 +2024-05-23 05:33:59,214 - INFO - +Epoch 2 +2024-05-23 05:34:27,021 - INFO - [204800/436063] Loss : 0.4802 +2024-05-23 05:34:34,643 - INFO - [262144/436063] Loss : 0.4795 +2024-05-23 05:34:37,611 - INFO - Test loss : 0.1852 +2024-05-23 05:34:37,611 - INFO - +Epoch 3 +2024-05-23 05:35:06,149 - INFO - [204800/436063] Loss : 0.4398 +2024-05-23 05:35:13,347 - INFO - [262144/436063] Loss : 0.4390 +2024-05-23 05:35:16,041 - INFO - Test loss : 0.1881 +2024-05-23 05:35:16,041 - INFO - +Epoch 4 +2024-05-23 05:35:44,970 - INFO - [204800/436063] Loss : 0.4138 +2024-05-23 05:35:51,735 - INFO - [262144/436063] Loss : 0.4148 +2024-05-23 05:35:54,277 - INFO - Test loss : 0.1868 +2024-05-23 05:35:54,278 - INFO - +Epoch 5 +2024-05-23 05:36:20,959 - INFO - [204800/436063] Loss : 0.3962 +2024-05-23 05:36:28,434 - INFO - [262144/436063] Loss : 0.3973 +2024-05-23 05:36:30,978 - INFO - Test loss : 0.1967 +2024-05-23 05:36:30,978 - INFO - +Epoch 6 +2024-05-23 05:36:56,839 - INFO - [204800/436063] Loss : 0.3829 +2024-05-23 05:37:03,523 - INFO - [262144/436063] Loss : 0.3835 +2024-05-23 05:37:06,255 - INFO - Test loss : 0.2072 +2024-05-23 05:37:06,255 - INFO - +Epoch 7 +2024-05-23 05:37:32,965 - INFO - [204800/436063] 
Loss : 0.3708 +2024-05-23 05:37:39,903 - INFO - [262144/436063] Loss : 0.3721 +2024-05-23 05:37:42,556 - INFO - Test loss : 0.2102 +2024-05-23 05:37:42,556 - INFO - +Epoch 8 +2024-05-23 05:38:09,165 - INFO - [204800/436063] Loss : 0.3629 +2024-05-23 05:38:16,168 - INFO - [262144/436063] Loss : 0.3637 +2024-05-23 05:38:18,664 - INFO - Test loss : 0.2158 +2024-05-23 05:38:18,664 - INFO - +Epoch 9 +2024-05-23 05:38:46,007 - INFO - [204800/436063] Loss : 0.3556 +2024-05-23 05:38:53,095 - INFO - [262144/436063] Loss : 0.3558 +2024-05-23 05:38:55,843 - INFO - Test loss : 0.2321 +2024-05-23 05:38:55,843 - INFO - +Epoch 10 +2024-05-23 05:39:22,615 - INFO - [204800/436063] Loss : 0.3489 +2024-05-23 05:39:30,097 - INFO - [262144/436063] Loss : 0.3505 +2024-05-23 05:39:32,857 - INFO - Test loss : 0.2277 +2024-05-23 05:39:33,563 - INFO - (24343,) +2024-05-23 05:39:54,106 - INFO - Split ID: 0 +2024-05-23 05:39:54,115 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 1.59 +2024-05-23 05:39:54,116 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 4.43 +2024-05-23 05:39:54,117 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 7.08 +2024-05-23 05:39:54,118 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 12.34 +2024-05-23 05:39:54,218 - INFO - +No prior +2024-05-23 05:39:54,432 - INFO - (24426,) +2024-05-23 05:40:04,072 - INFO - Split ID: 0 +2024-05-23 05:40:04,072 - INFO - Top 1 (Epoch 10)acc (%): 60.2 +2024-05-23 05:40:04,072 - INFO - Top 3 (Epoch 10)acc (%): 77.9 +2024-05-23 05:40:04,072 - INFO - Top 5 (Epoch 10)acc (%): 83.29 +2024-05-23 05:40:04,072 - INFO - Top 10 (Epoch 10)acc (%): 88.5 +2024-05-23 05:40:33,999 - INFO - Split ID: 0 +2024-05-23 05:40:33,999 - INFO - Top 1 (Epoch 10)acc (%): 72.64 +2024-05-23 05:40:33,999 - INFO - Top 3 (Epoch 10)acc (%): 87.13 +2024-05-23 05:40:33,999 - INFO - Top 5 (Epoch 10)acc (%): 90.68 +2024-05-23 05:40:33,999 - INFO - Top 10 (Epoch 10)acc (%): 93.9 +2024-05-23 05:40:34,000 - INFO - +Epoch 11 +2024-05-23 05:41:00,650 - INFO - [204800/436063] Loss : 0.3429 +2024-05-23 05:41:07,549 - INFO - [262144/436063] Loss : 0.3438 +2024-05-23 05:41:10,106 - INFO - Test loss : 0.2411 +2024-05-23 05:41:10,106 - INFO - +Epoch 12 +2024-05-23 05:41:36,903 - INFO - [204800/436063] Loss : 0.3382 +2024-05-23 05:41:43,738 - INFO - [262144/436063] Loss : 0.3395 +2024-05-23 05:41:46,196 - INFO - Test loss : 0.2388 +2024-05-23 05:41:46,196 - INFO - +Epoch 13 +2024-05-23 05:42:12,803 - INFO - [204800/436063] Loss : 0.3343 +2024-05-23 05:42:20,440 - INFO - [262144/436063] Loss : 0.3351 +2024-05-23 05:42:22,979 - INFO - Test loss : 0.2510 +2024-05-23 05:42:22,979 - INFO - +Epoch 14 +2024-05-23 05:42:49,782 - INFO - [204800/436063] Loss : 0.3292 +2024-05-23 05:42:57,008 - INFO - [262144/436063] Loss : 0.3304 +2024-05-23 05:42:59,922 - INFO - Test loss : 0.2415 +2024-05-23 05:42:59,922 - INFO - +Epoch 15 +2024-05-23 05:43:26,544 - INFO - [204800/436063] Loss : 0.3253 +2024-05-23 05:43:33,638 - INFO - [262144/436063] Loss : 0.3268 +2024-05-23 05:43:36,175 - INFO - Test loss : 0.2469 +2024-05-23 05:43:36,176 - INFO - +Epoch 16 +2024-05-23 05:44:02,397 - INFO - [204800/436063] Loss : 0.3220 +2024-05-23 05:44:09,172 - INFO - [262144/436063] Loss : 0.3234 +2024-05-23 05:44:11,672 - INFO - Test loss : 0.2457 +2024-05-23 05:44:11,673 - INFO - +Epoch 17 +2024-05-23 05:44:39,064 - INFO - [204800/436063] Loss : 0.3186 +2024-05-23 05:44:46,604 - INFO - [262144/436063] Loss : 0.3199 +2024-05-23 05:44:49,714 - INFO - Test loss : 0.2599 +2024-05-23 05:44:49,714 - INFO - +Epoch 18 +2024-05-23 05:45:18,186 - INFO - [204800/436063] Loss : 
0.3156 +2024-05-23 05:45:24,952 - INFO - [262144/436063] Loss : 0.3167 +2024-05-23 05:45:27,605 - INFO - Test loss : 0.2643 +2024-05-23 05:45:27,605 - INFO - +Epoch 19 +2024-05-23 05:45:55,661 - INFO - [204800/436063] Loss : 0.3135 +2024-05-23 05:46:02,502 - INFO - [262144/436063] Loss : 0.3145 +2024-05-23 05:46:05,336 - INFO - Test loss : 0.2652 +2024-05-23 05:46:05,336 - INFO - +Epoch 20 +2024-05-23 05:46:32,027 - INFO - [204800/436063] Loss : 0.3110 +2024-05-23 05:46:38,750 - INFO - [262144/436063] Loss : 0.3121 +2024-05-23 05:46:41,262 - INFO - Test loss : 0.2776 +2024-05-23 05:46:41,918 - INFO - (24343,) +2024-05-23 05:47:02,505 - INFO - Split ID: 0 +2024-05-23 05:47:02,514 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 2.27 +2024-05-23 05:47:02,515 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 5.93 +2024-05-23 05:47:02,516 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 9.08 +2024-05-23 05:47:02,517 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 15.54 +2024-05-23 05:47:02,626 - INFO - +No prior +2024-05-23 05:47:02,860 - INFO - (24426,) +2024-05-23 05:47:12,503 - INFO - Split ID: 0 +2024-05-23 05:47:12,503 - INFO - Top 1 (Epoch 20)acc (%): 60.2 +2024-05-23 05:47:12,503 - INFO - Top 3 (Epoch 20)acc (%): 77.9 +2024-05-23 05:47:12,503 - INFO - Top 5 (Epoch 20)acc (%): 83.29 +2024-05-23 05:47:12,503 - INFO - Top 10 (Epoch 20)acc (%): 88.5 +2024-05-23 05:47:40,595 - INFO - Split ID: 0 +2024-05-23 05:47:40,595 - INFO - Top 1 (Epoch 20)acc (%): 72.84 +2024-05-23 05:47:40,595 - INFO - Top 3 (Epoch 20)acc (%): 87.19 +2024-05-23 05:47:40,595 - INFO - Top 5 (Epoch 20)acc (%): 90.61 +2024-05-23 05:47:40,595 - INFO - Top 10 (Epoch 20)acc (%): 93.84 +2024-05-23 05:47:40,595 - INFO - +Epoch 21 +2024-05-23 05:48:06,944 - INFO - [204800/436063] Loss : 0.3080 +2024-05-23 05:48:13,877 - INFO - [262144/436063] Loss : 0.3092 +2024-05-23 05:48:16,338 - INFO - Test loss : 0.2837 +2024-05-23 05:48:16,339 - INFO - +Epoch 22 +2024-05-23 05:48:43,773 - INFO - [204800/436063] Loss : 0.3057 +2024-05-23 05:48:50,882 - INFO - [262144/436063] Loss : 0.3070 +2024-05-23 05:48:53,540 - INFO - Test loss : 0.2748 +2024-05-23 05:48:53,540 - INFO - +Epoch 23 +2024-05-23 05:49:22,125 - INFO - [204800/436063] Loss : 0.3032 +2024-05-23 05:49:29,354 - INFO - [262144/436063] Loss : 0.3043 +2024-05-23 05:49:32,110 - INFO - Test loss : 0.2850 +2024-05-23 05:49:32,110 - INFO - +Epoch 24 +2024-05-23 05:50:01,249 - INFO - [204800/436063] Loss : 0.3026 +2024-05-23 05:50:08,635 - INFO - [262144/436063] Loss : 0.3036 +2024-05-23 05:50:11,269 - INFO - Test loss : 0.3011 +2024-05-23 05:50:11,269 - INFO - +Epoch 25 +2024-05-23 05:50:39,605 - INFO - [204800/436063] Loss : 0.2990 +2024-05-23 05:50:46,553 - INFO - [262144/436063] Loss : 0.3000 +2024-05-23 05:50:49,718 - INFO - Test loss : 0.2937 +2024-05-23 05:50:49,718 - INFO - +Epoch 26 +2024-05-23 05:51:19,015 - INFO - [204800/436063] Loss : 0.2978 +2024-05-23 05:51:26,420 - INFO - [262144/436063] Loss : 0.2989 +2024-05-23 05:51:29,128 - INFO - Test loss : 0.3055 +2024-05-23 05:51:29,128 - INFO - +Epoch 27 +2024-05-23 05:51:57,573 - INFO - [204800/436063] Loss : 0.2965 +2024-05-23 05:52:05,529 - INFO - [262144/436063] Loss : 0.2973 +2024-05-23 05:52:08,328 - INFO - Test loss : 0.3043 +2024-05-23 05:52:08,328 - INFO - +Epoch 28 +2024-05-23 05:52:36,240 - INFO - [204800/436063] Loss : 0.2944 +2024-05-23 05:52:43,060 - INFO - [262144/436063] Loss : 0.2956 +2024-05-23 05:52:45,816 - INFO - Test loss : 0.3125 +2024-05-23 05:52:45,816 - INFO - +Epoch 29 +2024-05-23 05:53:26,604 - INFO - [204800/436063] Loss : 0.2924 
+2024-05-23 05:53:37,679 - INFO - [262144/436063] Loss : 0.2938 +2024-05-23 05:53:41,881 - INFO - Test loss : 0.3090 +2024-05-23 05:53:41,881 - INFO - +Epoch 30 +2024-05-23 05:54:22,428 - INFO - [204800/436063] Loss : 0.2912 +2024-05-23 05:54:33,469 - INFO - [262144/436063] Loss : 0.2922 +2024-05-23 05:54:37,703 - INFO - Test loss : 0.3120 +2024-05-23 05:54:38,378 - INFO - (24343,) +2024-05-23 05:54:58,914 - INFO - Split ID: 0 +2024-05-23 05:54:58,924 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 2.88 +2024-05-23 05:54:58,924 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 7.01 +2024-05-23 05:54:58,925 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 10.62 +2024-05-23 05:54:58,926 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 17.42 +2024-05-23 05:54:59,027 - INFO - +No prior +2024-05-23 05:54:59,244 - INFO - (24426,) +2024-05-23 05:55:08,869 - INFO - Split ID: 0 +2024-05-23 05:55:08,870 - INFO - Top 1 (Epoch 30)acc (%): 60.2 +2024-05-23 05:55:08,871 - INFO - Top 3 (Epoch 30)acc (%): 77.9 +2024-05-23 05:55:08,871 - INFO - Top 5 (Epoch 30)acc (%): 83.29 +2024-05-23 05:55:08,871 - INFO - Top 10 (Epoch 30)acc (%): 88.5 +2024-05-23 05:55:59,139 - INFO - Split ID: 0 +2024-05-23 05:55:59,140 - INFO - Top 1 (Epoch 30)acc (%): 72.94 +2024-05-23 05:55:59,140 - INFO - Top 3 (Epoch 30)acc (%): 87.16 +2024-05-23 05:55:59,140 - INFO - Top 5 (Epoch 30)acc (%): 90.51 +2024-05-23 05:55:59,140 - INFO - Top 10 (Epoch 30)acc (%): 93.74 +2024-05-23 05:55:59,140 - INFO - +Epoch 31 +2024-05-23 05:56:51,490 - INFO - [204800/436063] Loss : 0.2902 +2024-05-23 05:57:05,021 - INFO - [262144/436063] Loss : 0.2909 +2024-05-23 05:57:10,391 - INFO - Test loss : 0.3356 +2024-05-23 05:57:10,391 - INFO - +Epoch 32 +2024-05-23 05:58:01,783 - INFO - [204800/436063] Loss : 0.2880 +2024-05-23 05:58:14,540 - INFO - [262144/436063] Loss : 0.2888 +2024-05-23 05:58:20,035 - INFO - Test loss : 0.3137 +2024-05-23 05:58:20,035 - INFO - +Epoch 33 +2024-05-23 05:59:10,669 - INFO - [204800/436063] Loss : 0.2868 +2024-05-23 05:59:23,395 - INFO - [262144/436063] Loss : 0.2883 +2024-05-23 05:59:28,831 - INFO - Test loss : 0.3155 +2024-05-23 05:59:28,831 - INFO - +Epoch 34 +2024-05-23 06:00:20,739 - INFO - [204800/436063] Loss : 0.2864 +2024-05-23 06:00:34,139 - INFO - [262144/436063] Loss : 0.2874 +2024-05-23 06:00:38,765 - INFO - Test loss : 0.3280 +2024-05-23 06:00:38,765 - INFO - +Epoch 35 +2024-05-23 06:01:29,415 - INFO - [204800/436063] Loss : 0.2845 +2024-05-23 06:01:43,827 - INFO - [262144/436063] Loss : 0.2854 +2024-05-23 06:01:49,364 - INFO - Test loss : 0.3337 +2024-05-23 06:01:49,364 - INFO - +Epoch 36 +2024-05-23 06:02:39,119 - INFO - [204800/436063] Loss : 0.2839 +2024-05-23 06:02:52,740 - INFO - [262144/436063] Loss : 0.2848 +2024-05-23 06:02:57,592 - INFO - Test loss : 0.3352 +2024-05-23 06:02:57,592 - INFO - +Epoch 37 +2024-05-23 06:03:50,089 - INFO - [204800/436063] Loss : 0.2823 +2024-05-23 06:04:03,674 - INFO - [262144/436063] Loss : 0.2829 +2024-05-23 06:04:08,635 - INFO - Test loss : 0.3521 +2024-05-23 06:04:08,636 - INFO - +Epoch 38 +2024-05-23 06:05:00,680 - INFO - [204800/436063] Loss : 0.2814 +2024-05-23 06:05:13,938 - INFO - [262144/436063] Loss : 0.2818 +2024-05-23 06:05:18,945 - INFO - Test loss : 0.3494 +2024-05-23 06:05:18,946 - INFO - +Epoch 39 +2024-05-23 06:06:08,693 - INFO - [204800/436063] Loss : 0.2807 +2024-05-23 06:06:22,882 - INFO - [262144/436063] Loss : 0.2813 +2024-05-23 06:06:28,520 - INFO - Test loss : 0.3510 +2024-05-23 06:06:28,520 - INFO - +Epoch 40 +2024-05-23 06:07:18,836 - INFO - [204800/436063] Loss : 0.2792 
+2024-05-23 06:07:32,254 - INFO - [262144/436063] Loss : 0.2800 +2024-05-23 06:07:36,876 - INFO - Test loss : 0.3501 +2024-05-23 06:07:37,604 - INFO - (24343,) +2024-05-23 06:07:58,108 - INFO - Split ID: 0 +2024-05-23 06:07:58,117 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 2.98 +2024-05-23 06:07:58,118 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 7.48 +2024-05-23 06:07:58,119 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 11.11 +2024-05-23 06:07:58,119 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 18.35 +2024-05-23 06:07:58,224 - INFO - +No prior +2024-05-23 06:07:58,450 - INFO - (24426,) +2024-05-23 06:08:08,081 - INFO - Split ID: 0 +2024-05-23 06:08:08,081 - INFO - Top 1 (Epoch 40)acc (%): 60.2 +2024-05-23 06:08:08,081 - INFO - Top 3 (Epoch 40)acc (%): 77.9 +2024-05-23 06:08:08,081 - INFO - Top 5 (Epoch 40)acc (%): 83.29 +2024-05-23 06:08:08,081 - INFO - Top 10 (Epoch 40)acc (%): 88.5 +2024-05-23 06:08:58,254 - INFO - Split ID: 0 +2024-05-23 06:08:58,254 - INFO - Top 1 (Epoch 40)acc (%): 72.94 +2024-05-23 06:08:58,254 - INFO - Top 3 (Epoch 40)acc (%): 87.15 +2024-05-23 06:08:58,255 - INFO - Top 5 (Epoch 40)acc (%): 90.52 +2024-05-23 06:08:58,255 - INFO - Top 10 (Epoch 40)acc (%): 93.68 +2024-05-23 06:08:58,255 - INFO - +Epoch 41 +2024-05-23 06:09:50,745 - INFO - [204800/436063] Loss : 0.2782 +2024-05-23 06:10:04,832 - INFO - [262144/436063] Loss : 0.2792 +2024-05-23 06:10:10,411 - INFO - Test loss : 0.3579 +2024-05-23 06:10:10,411 - INFO - +Epoch 42 +2024-05-23 06:11:02,655 - INFO - [204800/436063] Loss : 0.2780 +2024-05-23 06:11:16,700 - INFO - [262144/436063] Loss : 0.2789 +2024-05-23 06:11:21,988 - INFO - Test loss : 0.3540 +2024-05-23 06:11:21,988 - INFO - +Epoch 43 +2024-05-23 06:12:14,373 - INFO - [204800/436063] Loss : 0.2770 +2024-05-23 06:12:27,142 - INFO - [262144/436063] Loss : 0.2779 +2024-05-23 06:12:32,538 - INFO - Test loss : 0.3522 +2024-05-23 06:12:32,538 - INFO - +Epoch 44 +2024-05-23 06:13:23,796 - INFO - [204800/436063] Loss : 0.2754 +2024-05-23 06:13:37,609 - INFO - [262144/436063] Loss : 0.2762 +2024-05-23 06:13:42,434 - INFO - Test loss : 0.3580 +2024-05-23 06:13:42,434 - INFO - +Epoch 45 +2024-05-23 06:14:32,888 - INFO - [204800/436063] Loss : 0.2747 +2024-05-23 06:14:46,222 - INFO - [262144/436063] Loss : 0.2755 +2024-05-23 06:14:51,946 - INFO - Test loss : 0.3681 +2024-05-23 06:14:51,946 - INFO - +Epoch 46 +2024-05-23 06:15:41,850 - INFO - [204800/436063] Loss : 0.2738 +2024-05-23 06:15:54,717 - INFO - [262144/436063] Loss : 0.2749 +2024-05-23 06:15:59,804 - INFO - Test loss : 0.3604 +2024-05-23 06:15:59,804 - INFO - +Epoch 47 +2024-05-23 06:16:50,874 - INFO - [204800/436063] Loss : 0.2731 +2024-05-23 06:17:04,516 - INFO - [262144/436063] Loss : 0.2738 +2024-05-23 06:17:10,019 - INFO - Test loss : 0.3727 +2024-05-23 06:17:10,020 - INFO - +Epoch 48 +2024-05-23 06:18:00,901 - INFO - [204800/436063] Loss : 0.2719 +2024-05-23 06:18:15,483 - INFO - [262144/436063] Loss : 0.2728 +2024-05-23 06:18:20,766 - INFO - Test loss : 0.3884 +2024-05-23 06:18:20,766 - INFO - +Epoch 49 +2024-05-23 06:19:12,540 - INFO - [204800/436063] Loss : 0.2716 +2024-05-23 06:19:26,222 - INFO - [262144/436063] Loss : 0.2720 +2024-05-23 06:19:31,131 - INFO - Test loss : 0.3862 +2024-05-23 06:19:31,131 - INFO - +Epoch 50 +2024-05-23 06:20:23,060 - INFO - [204800/436063] Loss : 0.2706 +2024-05-23 06:20:35,436 - INFO - [262144/436063] Loss : 0.2712 +2024-05-23 06:20:40,439 - INFO - Test loss : 0.3867 +2024-05-23 06:20:41,159 - INFO - (24343,) +2024-05-23 06:21:01,712 - INFO - Split ID: 0 +2024-05-23 06:21:01,721 
- INFO - Top 1 LocEnc (Epoch 50)acc (%): 3.38 +2024-05-23 06:21:01,722 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 8.13 +2024-05-23 06:21:01,723 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 12.21 +2024-05-23 06:21:01,723 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 19.72 +2024-05-23 06:21:01,833 - INFO - +No prior +2024-05-23 06:21:02,061 - INFO - (24426,) +2024-05-23 06:21:11,740 - INFO - Split ID: 0 +2024-05-23 06:21:11,740 - INFO - Top 1 (Epoch 50)acc (%): 60.2 +2024-05-23 06:21:11,740 - INFO - Top 3 (Epoch 50)acc (%): 77.9 +2024-05-23 06:21:11,740 - INFO - Top 5 (Epoch 50)acc (%): 83.29 +2024-05-23 06:21:11,740 - INFO - Top 10 (Epoch 50)acc (%): 88.5 +2024-05-23 06:22:02,803 - INFO - Split ID: 0 +2024-05-23 06:22:02,803 - INFO - Top 1 (Epoch 50)acc (%): 73.04 +2024-05-23 06:22:02,803 - INFO - Top 3 (Epoch 50)acc (%): 87.08 +2024-05-23 06:22:02,803 - INFO - Top 5 (Epoch 50)acc (%): 90.44 +2024-05-23 06:22:02,803 - INFO - Top 10 (Epoch 50)acc (%): 93.56 +2024-05-23 06:22:02,804 - INFO - +Epoch 51 +2024-05-23 06:22:53,093 - INFO - [204800/436063] Loss : 0.2694 +2024-05-23 06:23:07,158 - INFO - [262144/436063] Loss : 0.2704 +2024-05-23 06:23:12,230 - INFO - Test loss : 0.3792 +2024-05-23 06:23:12,230 - INFO - +Epoch 52 +2024-05-23 06:24:04,508 - INFO - [204800/436063] Loss : 0.2696 +2024-05-23 06:24:17,541 - INFO - [262144/436063] Loss : 0.2703 +2024-05-23 06:24:22,831 - INFO - Test loss : 0.3968 +2024-05-23 06:24:22,832 - INFO - +Epoch 53 +2024-05-23 06:25:16,492 - INFO - [204800/436063] Loss : 0.2682 +2024-05-23 06:25:29,936 - INFO - [262144/436063] Loss : 0.2689 +2024-05-23 06:25:35,018 - INFO - Test loss : 0.3882 +2024-05-23 06:25:35,018 - INFO - +Epoch 54 +2024-05-23 06:26:27,551 - INFO - [204800/436063] Loss : 0.2677 +2024-05-23 06:26:41,266 - INFO - [262144/436063] Loss : 0.2682 +2024-05-23 06:26:46,778 - INFO - Test loss : 0.3907 +2024-05-23 06:26:46,778 - INFO - +Epoch 55 +2024-05-23 06:27:39,069 - INFO - [204800/436063] Loss : 0.2666 +2024-05-23 06:27:52,432 - INFO - [262144/436063] Loss : 0.2675 +2024-05-23 06:27:57,215 - INFO - Test loss : 0.3924 +2024-05-23 06:27:57,215 - INFO - +Epoch 56 +2024-05-23 06:28:49,060 - INFO - [204800/436063] Loss : 0.2661 +2024-05-23 06:29:02,600 - INFO - [262144/436063] Loss : 0.2666 +2024-05-23 06:29:07,507 - INFO - Test loss : 0.4021 +2024-05-23 06:29:07,507 - INFO - +Epoch 57 +2024-05-23 06:29:58,992 - INFO - [204800/436063] Loss : 0.2659 +2024-05-23 06:30:12,026 - INFO - [262144/436063] Loss : 0.2666 +2024-05-23 06:30:17,584 - INFO - Test loss : 0.4034 +2024-05-23 06:30:17,584 - INFO - +Epoch 58 +2024-05-23 06:31:09,088 - INFO - [204800/436063] Loss : 0.2645 +2024-05-23 06:31:19,939 - INFO - [262144/436063] Loss : 0.2651 +2024-05-23 06:31:24,554 - INFO - Test loss : 0.4077 +2024-05-23 06:31:24,555 - INFO - +Epoch 59 +2024-05-23 06:32:09,855 - INFO - [204800/436063] Loss : 0.2649 +2024-05-23 06:32:21,267 - INFO - [262144/436063] Loss : 0.2653 +2024-05-23 06:32:25,757 - INFO - Test loss : 0.4071 +2024-05-23 06:32:25,757 - INFO - +Epoch 60 +2024-05-23 06:33:09,530 - INFO - [204800/436063] Loss : 0.2630 +2024-05-23 06:33:18,870 - INFO - [262144/436063] Loss : 0.2639 +2024-05-23 06:33:24,162 - INFO - Test loss : 0.4045 +2024-05-23 06:33:24,888 - INFO - (24343,) +2024-05-23 06:33:45,470 - INFO - Split ID: 0 +2024-05-23 06:33:45,480 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 3.8 +2024-05-23 06:33:45,481 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 9.0 +2024-05-23 06:33:45,481 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 13.02 +2024-05-23 06:33:45,482 - INFO - Top 10 
LocEnc (Epoch 60)acc (%): 20.64 +2024-05-23 06:33:45,590 - INFO - +No prior +2024-05-23 06:33:45,817 - INFO - (24426,) +2024-05-23 06:33:55,501 - INFO - Split ID: 0 +2024-05-23 06:33:55,501 - INFO - Top 1 (Epoch 60)acc (%): 60.2 +2024-05-23 06:33:55,501 - INFO - Top 3 (Epoch 60)acc (%): 77.9 +2024-05-23 06:33:55,501 - INFO - Top 5 (Epoch 60)acc (%): 83.29 +2024-05-23 06:33:55,501 - INFO - Top 10 (Epoch 60)acc (%): 88.5 +2024-05-23 06:34:49,730 - INFO - Split ID: 0 +2024-05-23 06:34:49,730 - INFO - Top 1 (Epoch 60)acc (%): 73.03 +2024-05-23 06:34:49,730 - INFO - Top 3 (Epoch 60)acc (%): 86.94 +2024-05-23 06:34:49,730 - INFO - Top 5 (Epoch 60)acc (%): 90.43 +2024-05-23 06:34:49,730 - INFO - Top 10 (Epoch 60)acc (%): 93.48 +2024-05-23 06:34:49,731 - INFO - +Epoch 61 +2024-05-23 06:35:46,626 - INFO - [204800/436063] Loss : 0.2631 +2024-05-23 06:36:00,787 - INFO - [262144/436063] Loss : 0.2634 +2024-05-23 06:36:06,587 - INFO - Test loss : 0.4228 +2024-05-23 06:36:06,587 - INFO - +Epoch 62 +2024-05-23 06:37:02,842 - INFO - [204800/436063] Loss : 0.2629 +2024-05-23 06:37:17,096 - INFO - [262144/436063] Loss : 0.2632 +2024-05-23 06:37:23,265 - INFO - Test loss : 0.4132 +2024-05-23 06:37:23,265 - INFO - +Epoch 63 +2024-05-23 06:38:19,866 - INFO - [204800/436063] Loss : 0.2614 +2024-05-23 06:38:34,466 - INFO - [262144/436063] Loss : 0.2625 +2024-05-23 06:38:39,981 - INFO - Test loss : 0.4119 +2024-05-23 06:38:39,982 - INFO - +Epoch 64 +2024-05-23 06:39:35,894 - INFO - [204800/436063] Loss : 0.2616 +2024-05-23 06:39:50,468 - INFO - [262144/436063] Loss : 0.2619 +2024-05-23 06:39:56,816 - INFO - Test loss : 0.4061 +2024-05-23 06:39:56,816 - INFO - +Epoch 65 +2024-05-23 06:40:53,580 - INFO - [204800/436063] Loss : 0.2605 +2024-05-23 06:41:08,347 - INFO - [262144/436063] Loss : 0.2612 +2024-05-23 06:41:14,545 - INFO - Test loss : 0.4170 +2024-05-23 06:41:14,545 - INFO - +Epoch 66 +2024-05-23 06:42:10,958 - INFO - [204800/436063] Loss : 0.2601 +2024-05-23 06:42:25,814 - INFO - [262144/436063] Loss : 0.2609 +2024-05-23 06:42:31,983 - INFO - Test loss : 0.4189 +2024-05-23 06:42:31,983 - INFO - +Epoch 67 +2024-05-23 06:43:28,712 - INFO - [204800/436063] Loss : 0.2601 +2024-05-23 06:43:43,225 - INFO - [262144/436063] Loss : 0.2603 +2024-05-23 06:43:49,510 - INFO - Test loss : 0.4320 +2024-05-23 06:43:49,510 - INFO - +Epoch 68 +2024-05-23 06:44:46,252 - INFO - [204800/436063] Loss : 0.2589 +2024-05-23 06:45:00,597 - INFO - [262144/436063] Loss : 0.2599 +2024-05-23 06:45:06,064 - INFO - Test loss : 0.4146 +2024-05-23 06:45:06,064 - INFO - +Epoch 69 +2024-05-23 06:46:02,159 - INFO - [204800/436063] Loss : 0.2591 +2024-05-23 06:46:16,781 - INFO - [262144/436063] Loss : 0.2599 +2024-05-23 06:46:22,734 - INFO - Test loss : 0.4208 +2024-05-23 06:46:22,734 - INFO - +Epoch 70 +2024-05-23 06:47:18,830 - INFO - [204800/436063] Loss : 0.2580 +2024-05-23 06:47:33,147 - INFO - [262144/436063] Loss : 0.2588 +2024-05-23 06:47:39,101 - INFO - Test loss : 0.4272 +2024-05-23 06:47:39,828 - INFO - (24343,) +2024-05-23 06:48:00,474 - INFO - Split ID: 0 +2024-05-23 06:48:00,483 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 4.01 +2024-05-23 06:48:00,484 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 9.23 +2024-05-23 06:48:00,485 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 13.42 +2024-05-23 06:48:00,486 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 21.0 +2024-05-23 06:48:00,596 - INFO - +No prior +2024-05-23 06:48:00,829 - INFO - (24426,) +2024-05-23 06:48:10,518 - INFO - Split ID: 0 +2024-05-23 06:48:10,519 - INFO - Top 1 (Epoch 70)acc (%): 60.2 
+2024-05-23 06:48:10,519 - INFO - Top 3 (Epoch 70)acc (%): 77.9 +2024-05-23 06:48:10,519 - INFO - Top 5 (Epoch 70)acc (%): 83.29 +2024-05-23 06:48:10,519 - INFO - Top 10 (Epoch 70)acc (%): 88.5 +2024-05-23 06:49:04,296 - INFO - Split ID: 0 +2024-05-23 06:49:04,297 - INFO - Top 1 (Epoch 70)acc (%): 73.07 +2024-05-23 06:49:04,297 - INFO - Top 3 (Epoch 70)acc (%): 86.84 +2024-05-23 06:49:04,297 - INFO - Top 5 (Epoch 70)acc (%): 90.33 +2024-05-23 06:49:04,297 - INFO - Top 10 (Epoch 70)acc (%): 93.44 +2024-05-23 06:49:04,297 - INFO - +Epoch 71 +2024-05-23 06:50:00,533 - INFO - [204800/436063] Loss : 0.2576 +2024-05-23 06:50:15,481 - INFO - [262144/436063] Loss : 0.2584 +2024-05-23 06:50:21,449 - INFO - Test loss : 0.4294 +2024-05-23 06:50:21,449 - INFO - +Epoch 72 +2024-05-23 06:51:17,146 - INFO - [204800/436063] Loss : 0.2569 +2024-05-23 06:51:31,420 - INFO - [262144/436063] Loss : 0.2576 +2024-05-23 06:51:37,154 - INFO - Test loss : 0.4235 +2024-05-23 06:51:37,154 - INFO - +Epoch 73 +2024-05-23 06:52:34,067 - INFO - [204800/436063] Loss : 0.2566 +2024-05-23 06:52:48,995 - INFO - [262144/436063] Loss : 0.2572 +2024-05-23 06:52:55,047 - INFO - Test loss : 0.4322 +2024-05-23 06:52:55,048 - INFO - +Epoch 74 +2024-05-23 06:53:51,495 - INFO - [204800/436063] Loss : 0.2565 +2024-05-23 06:54:06,498 - INFO - [262144/436063] Loss : 0.2572 +2024-05-23 06:54:12,448 - INFO - Test loss : 0.4395 +2024-05-23 06:54:12,448 - INFO - +Epoch 75 +2024-05-23 06:55:07,139 - INFO - [204800/436063] Loss : 0.2557 +2024-05-23 06:55:20,684 - INFO - [262144/436063] Loss : 0.2560 +2024-05-23 06:55:25,730 - INFO - Test loss : 0.4454 +2024-05-23 06:55:25,730 - INFO - +Epoch 76 +2024-05-23 06:56:15,942 - INFO - [204800/436063] Loss : 0.2554 +2024-05-23 06:56:29,938 - INFO - [262144/436063] Loss : 0.2559 +2024-05-23 06:56:35,450 - INFO - Test loss : 0.4328 +2024-05-23 06:56:35,450 - INFO - +Epoch 77 +2024-05-23 06:57:26,005 - INFO - [204800/436063] Loss : 0.2554 +2024-05-23 06:57:39,441 - INFO - [262144/436063] Loss : 0.2557 +2024-05-23 06:57:44,408 - INFO - Test loss : 0.4421 +2024-05-23 06:57:44,408 - INFO - +Epoch 78 +2024-05-23 06:58:36,540 - INFO - [204800/436063] Loss : 0.2546 +2024-05-23 06:58:49,836 - INFO - [262144/436063] Loss : 0.2550 +2024-05-23 06:58:55,015 - INFO - Test loss : 0.4456 +2024-05-23 06:58:55,015 - INFO - +Epoch 79 +2024-05-23 06:59:44,554 - INFO - [204800/436063] Loss : 0.2541 +2024-05-23 06:59:58,128 - INFO - [262144/436063] Loss : 0.2546 +2024-05-23 07:00:03,709 - INFO - Test loss : 0.4569 +2024-05-23 07:00:03,710 - INFO - +Epoch 80 +2024-05-23 07:00:54,351 - INFO - [204800/436063] Loss : 0.2538 +2024-05-23 07:01:07,608 - INFO - [262144/436063] Loss : 0.2544 +2024-05-23 07:01:12,587 - INFO - Test loss : 0.4421 +2024-05-23 07:01:13,322 - INFO - (24343,) +2024-05-23 07:01:33,908 - INFO - Split ID: 0 +2024-05-23 07:01:33,917 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 4.04 +2024-05-23 07:01:33,918 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 9.54 +2024-05-23 07:01:33,919 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 13.7 +2024-05-23 07:01:33,919 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 21.68 +2024-05-23 07:01:34,020 - INFO - +No prior +2024-05-23 07:01:34,239 - INFO - (24426,) +2024-05-23 07:01:43,886 - INFO - Split ID: 0 +2024-05-23 07:01:43,886 - INFO - Top 1 (Epoch 80)acc (%): 60.2 +2024-05-23 07:01:43,886 - INFO - Top 3 (Epoch 80)acc (%): 77.9 +2024-05-23 07:01:43,886 - INFO - Top 5 (Epoch 80)acc (%): 83.29 +2024-05-23 07:01:43,886 - INFO - Top 10 (Epoch 80)acc (%): 88.5 +2024-05-23 07:02:32,911 - INFO - 
Split ID: 0 +2024-05-23 07:02:32,911 - INFO - Top 1 (Epoch 80)acc (%): 72.99 +2024-05-23 07:02:32,911 - INFO - Top 3 (Epoch 80)acc (%): 86.81 +2024-05-23 07:02:32,912 - INFO - Top 5 (Epoch 80)acc (%): 90.26 +2024-05-23 07:02:32,912 - INFO - Top 10 (Epoch 80)acc (%): 93.36 +2024-05-23 07:02:32,912 - INFO - +Epoch 81 +2024-05-23 07:03:22,695 - INFO - [204800/436063] Loss : 0.2525 +2024-05-23 07:03:36,000 - INFO - [262144/436063] Loss : 0.2537 +2024-05-23 07:03:42,014 - INFO - Test loss : 0.4454 +2024-05-23 07:03:42,014 - INFO - +Epoch 82 +2024-05-23 07:04:34,854 - INFO - [204800/436063] Loss : 0.2530 +2024-05-23 07:04:48,218 - INFO - [262144/436063] Loss : 0.2535 +2024-05-23 07:04:53,470 - INFO - Test loss : 0.4588 +2024-05-23 07:04:53,470 - INFO - +Epoch 83 +2024-05-23 07:05:43,640 - INFO - [204800/436063] Loss : 0.2527 +2024-05-23 07:05:56,787 - INFO - [262144/436063] Loss : 0.2532 +2024-05-23 07:06:02,085 - INFO - Test loss : 0.4628 +2024-05-23 07:06:02,086 - INFO - +Epoch 84 +2024-05-23 07:06:53,419 - INFO - [204800/436063] Loss : 0.2525 +2024-05-23 07:07:06,761 - INFO - [262144/436063] Loss : 0.2531 +2024-05-23 07:07:12,011 - INFO - Test loss : 0.4577 +2024-05-23 07:07:12,011 - INFO - +Epoch 85 +2024-05-23 07:08:02,864 - INFO - [204800/436063] Loss : 0.2519 +2024-05-23 07:08:11,904 - INFO - [262144/436063] Loss : 0.2523 +2024-05-23 07:08:15,306 - INFO - Test loss : 0.4559 +2024-05-23 07:08:15,306 - INFO - +Epoch 86 +2024-05-23 07:08:50,465 - INFO - [204800/436063] Loss : 0.2519 +2024-05-23 07:08:58,651 - INFO - [262144/436063] Loss : 0.2523 +2024-05-23 07:09:01,792 - INFO - Test loss : 0.4605 +2024-05-23 07:09:01,792 - INFO - +Epoch 87 +2024-05-23 07:09:17,385 - INFO - [204800/436063] Loss : 0.2508 +2024-05-23 07:09:20,873 - INFO - [262144/436063] Loss : 0.2511 +2024-05-23 07:09:22,203 - INFO - Test loss : 0.4714 +2024-05-23 07:09:22,203 - INFO - +Epoch 88 +2024-05-23 07:09:35,638 - INFO - [204800/436063] Loss : 0.2510 +2024-05-23 07:09:38,975 - INFO - [262144/436063] Loss : 0.2510 +2024-05-23 07:09:40,318 - INFO - Test loss : 0.4641 +2024-05-23 07:09:40,319 - INFO - +Epoch 89 +2024-05-23 07:09:53,814 - INFO - [204800/436063] Loss : 0.2504 +2024-05-23 07:09:57,419 - INFO - [262144/436063] Loss : 0.2501 +2024-05-23 07:09:58,835 - INFO - Test loss : 0.4725 +2024-05-23 07:09:58,836 - INFO - +Epoch 90 +2024-05-23 07:10:13,022 - INFO - [204800/436063] Loss : 0.2501 +2024-05-23 07:10:16,713 - INFO - [262144/436063] Loss : 0.2504 +2024-05-23 07:10:17,971 - INFO - Test loss : 0.4616 +2024-05-23 07:10:18,570 - INFO - (24343,) +2024-05-23 07:10:39,074 - INFO - Split ID: 0 +2024-05-23 07:10:39,083 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 4.31 +2024-05-23 07:10:39,084 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 9.96 +2024-05-23 07:10:39,085 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 13.92 +2024-05-23 07:10:39,086 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 22.02 +2024-05-23 07:10:39,196 - INFO - +No prior +2024-05-23 07:10:39,430 - INFO - (24426,) +2024-05-23 07:10:49,042 - INFO - Split ID: 0 +2024-05-23 07:10:49,043 - INFO - Top 1 (Epoch 90)acc (%): 60.2 +2024-05-23 07:10:49,043 - INFO - Top 3 (Epoch 90)acc (%): 77.9 +2024-05-23 07:10:49,043 - INFO - Top 5 (Epoch 90)acc (%): 83.29 +2024-05-23 07:10:49,043 - INFO - Top 10 (Epoch 90)acc (%): 88.5 +2024-05-23 07:11:32,512 - INFO - Split ID: 0 +2024-05-23 07:11:32,512 - INFO - Top 1 (Epoch 90)acc (%): 72.99 +2024-05-23 07:11:32,512 - INFO - Top 3 (Epoch 90)acc (%): 86.67 +2024-05-23 07:11:32,512 - INFO - Top 5 (Epoch 90)acc (%): 90.17 +2024-05-23 
07:11:32,512 - INFO - Top 10 (Epoch 90)acc (%): 93.28 +2024-05-23 07:11:32,513 - INFO - +Epoch 91 +2024-05-23 07:12:15,838 - INFO - [204800/436063] Loss : 0.2499 +2024-05-23 07:12:26,927 - INFO - [262144/436063] Loss : 0.2505 +2024-05-23 07:12:32,149 - INFO - Test loss : 0.4673 +2024-05-23 07:12:32,149 - INFO - +Epoch 92 +2024-05-23 07:13:04,308 - INFO - [204800/436063] Loss : 0.2490 +2024-05-23 07:13:13,509 - INFO - [262144/436063] Loss : 0.2495 +2024-05-23 07:13:16,587 - INFO - Test loss : 0.4695 +2024-05-23 07:13:16,588 - INFO - +Epoch 93 +2024-05-23 07:13:49,616 - INFO - [204800/436063] Loss : 0.2497 +2024-05-23 07:13:59,025 - INFO - [262144/436063] Loss : 0.2500 +2024-05-23 07:14:02,123 - INFO - Test loss : 0.4727 +2024-05-23 07:14:02,123 - INFO - +Epoch 94 +2024-05-23 07:14:35,299 - INFO - [204800/436063] Loss : 0.2491 +2024-05-23 07:14:44,186 - INFO - [262144/436063] Loss : 0.2499 +2024-05-23 07:14:47,777 - INFO - Test loss : 0.4638 +2024-05-23 07:14:47,778 - INFO - +Epoch 95 +2024-05-23 07:15:20,577 - INFO - [204800/436063] Loss : 0.2486 +2024-05-23 07:15:29,120 - INFO - [262144/436063] Loss : 0.2488 +2024-05-23 07:15:32,252 - INFO - Test loss : 0.4817 +2024-05-23 07:15:32,252 - INFO - +Epoch 96 +2024-05-23 07:16:04,422 - INFO - [204800/436063] Loss : 0.2482 +2024-05-23 07:16:13,100 - INFO - [262144/436063] Loss : 0.2486 +2024-05-23 07:16:17,039 - INFO - Test loss : 0.4744 +2024-05-23 07:16:17,039 - INFO - +Epoch 97 +2024-05-23 07:16:49,608 - INFO - [204800/436063] Loss : 0.2484 +2024-05-23 07:16:58,536 - INFO - [262144/436063] Loss : 0.2485 +2024-05-23 07:17:02,159 - INFO - Test loss : 0.4877 +2024-05-23 07:17:02,159 - INFO - +Epoch 98 +2024-05-23 07:17:35,470 - INFO - [204800/436063] Loss : 0.2477 +2024-05-23 07:17:43,250 - INFO - [262144/436063] Loss : 0.2480 +2024-05-23 07:17:46,796 - INFO - Test loss : 0.4879 +2024-05-23 07:17:46,797 - INFO - +Epoch 99 +2024-05-23 07:18:20,085 - INFO - [204800/436063] Loss : 0.2475 +2024-05-23 07:18:29,349 - INFO - [262144/436063] Loss : 0.2475 +2024-05-23 07:18:32,452 - INFO - Test loss : 0.4915 +2024-05-23 07:18:32,452 - INFO - Saving output model to ../models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-23 07:18:32,516 - INFO - Saving output model to ../models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-23 07:18:32,789 - INFO - +No prior +2024-05-23 07:18:33,022 - INFO - (24426,) +2024-05-23 07:18:42,651 - INFO - Split ID: 0 +2024-05-23 07:18:42,651 - INFO - Top 1 acc (%): 60.2 +2024-05-23 07:18:42,651 - INFO - Top 3 acc (%): 77.9 +2024-05-23 07:18:42,651 - INFO - Top 5 acc (%): 83.29 +2024-05-23 07:18:42,651 - INFO - Top 10 acc (%): 88.5 +2024-05-23 07:19:17,513 - INFO - Split ID: 0 +2024-05-23 07:19:17,514 - INFO - Top 1 acc (%): 73.06 +2024-05-23 07:19:17,514 - INFO - Top 3 acc (%): 86.58 +2024-05-23 07:19:17,514 - INFO - Top 5 acc (%): 90.1 +2024-05-23 07:19:17,514 - INFO - Top 10 acc (%): 93.15 +2024-05-23 07:19:17,515 - INFO - +Space2Vec-grid +2024-05-23 07:19:17,515 - INFO - Model : model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-23 07:19:18,266 - INFO - (24343,) +2024-05-23 07:19:38,749 - INFO - Split ID: 0 +2024-05-23 07:19:38,758 - INFO - Top 1 LocEnc acc (%): 4.35 +2024-05-23 07:19:38,759 - INFO - Top 3 LocEnc acc (%): 10.05 +2024-05-23 07:19:38,760 - INFO - Top 5 LocEnc acc (%): 14.25 +2024-05-23 07:19:38,761 - INFO - Top 10 LocEnc 
acc (%): 22.39 +2024-05-31 01:45:34,777 - INFO - +num_classes 8142 +2024-05-31 01:45:34,777 - INFO - num train 436063 +2024-05-31 01:45:34,777 - INFO - num val 24343 +2024-05-31 01:45:34,777 - INFO - train loss full_loss +2024-05-31 01:45:34,777 - INFO - model name ../models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:45:34,777 - INFO - num users 18643 +2024-05-31 01:45:35,668 - INFO - +Only Space2Vec-grid +2024-05-31 01:45:35,668 - INFO - Model : model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:45:35,918 - INFO - Saving output model to ../models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:45:36,175 - INFO - Saving output model to ../models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:45:36,483 - INFO - +No prior +2024-05-31 01:45:36,739 - INFO - (24426,) +2024-05-31 01:45:53,930 - INFO - Save results to ../eval_results/eval_inat_2018__val_no_prior.csv +2024-05-31 01:45:53,931 - INFO - Split ID: 0 +2024-05-31 01:45:53,931 - INFO - Top 1 acc (%): 60.2 +2024-05-31 01:45:53,931 - INFO - Top 3 acc (%): 77.9 +2024-05-31 01:45:53,931 - INFO - Top 5 acc (%): 83.29 +2024-05-31 01:45:53,931 - INFO - Top 10 acc (%): 88.5 +2024-05-31 01:46:19,149 - INFO - Split ID: 0 +2024-05-31 01:46:19,149 - INFO - Top 1 hit (%): 73.06 +2024-05-31 01:46:19,149 - INFO - Top 3 hit (%): 86.58 +2024-05-31 01:46:19,149 - INFO - Top 5 hit (%): 90.1 +2024-05-31 01:46:19,149 - INFO - Top 10 hit (%): 93.15 +2024-05-31 01:46:19,156 - INFO - +Only Space2Vec-grid +2024-05-31 01:46:19,156 - INFO - Model : model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:46:19,816 - INFO - (24343,) +2024-05-31 01:46:39,986 - INFO - Split ID: 0 +2024-05-31 01:46:39,995 - INFO - Top 1 LocEnc acc (%): 4.35 +2024-05-31 01:46:39,996 - INFO - Top 3 LocEnc acc (%): 10.05 +2024-05-31 01:46:39,997 - INFO - Top 5 LocEnc acc (%): 14.25 +2024-05-31 01:46:39,997 - INFO - Top 10 LocEnc acc (%): 22.39 diff --git a/pre_trained_models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..9e97c16e Binary files /dev/null and b/pre_trained_models/space2vec_grid/model_inat_2018_Space2Vec-grid_0.0100_32_0.0001000_360.000_1_512_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..ac388bc3 --- /dev/null +++ b/pre_trained_models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.log @@ -0,0 +1,226 @@ +2024-05-21 09:18:18,726 - INFO - +num_classes 555 +2024-05-21 09:18:18,726 - INFO - num train 22599 +2024-05-21 09:18:18,726 - INFO - num val 1100 +2024-05-21 09:18:18,726 - INFO - train loss full_loss +2024-05-21 09:18:18,726 - INFO - model name 
../models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-21 09:18:18,726 - INFO - num users 5331 +2024-05-21 09:18:18,726 - INFO - meta data ebird_meta +2024-05-21 09:18:19,457 - INFO - +Epoch 0 +2024-05-21 09:18:41,868 - INFO - [20480/22599] Loss : 1.7161 +2024-05-21 09:18:42,918 - INFO - Test loss : 0.3869 +2024-05-21 09:18:42,918 - INFO - +Epoch 1 +2024-05-21 09:19:04,808 - INFO - [20480/22599] Loss : 1.1914 +2024-05-21 09:19:05,815 - INFO - Test loss : 0.3971 +2024-05-21 09:19:05,815 - INFO - +Epoch 2 +2024-05-21 09:19:27,707 - INFO - [20480/22599] Loss : 1.0732 +2024-05-21 09:19:28,749 - INFO - Test loss : 0.3502 +2024-05-21 09:19:28,749 - INFO - +Epoch 3 +2024-05-21 09:19:50,608 - INFO - [20480/22599] Loss : 1.0194 +2024-05-21 09:19:51,632 - INFO - Test loss : 0.3984 +2024-05-21 09:19:51,632 - INFO - +Epoch 4 +2024-05-21 09:20:13,494 - INFO - [20480/22599] Loss : 0.9826 +2024-05-21 09:20:14,547 - INFO - Test loss : 0.3830 +2024-05-21 09:20:14,548 - INFO - +Epoch 5 +2024-05-21 09:20:36,401 - INFO - [20480/22599] Loss : 0.9506 +2024-05-21 09:20:37,451 - INFO - Test loss : 0.3845 +2024-05-21 09:20:37,451 - INFO - +Epoch 6 +2024-05-21 09:20:59,292 - INFO - [20480/22599] Loss : 0.9272 +2024-05-21 09:21:00,337 - INFO - Test loss : 0.3876 +2024-05-21 09:21:00,337 - INFO - +Epoch 7 +2024-05-21 09:21:20,440 - INFO - [20480/22599] Loss : 0.9109 +2024-05-21 09:21:21,508 - INFO - Test loss : 0.4047 +2024-05-21 09:21:21,509 - INFO - +Epoch 8 +2024-05-21 09:21:43,315 - INFO - [20480/22599] Loss : 0.8911 +2024-05-21 09:21:44,363 - INFO - Test loss : 0.4561 +2024-05-21 09:21:44,363 - INFO - +Epoch 9 +2024-05-21 09:22:06,164 - INFO - [20480/22599] Loss : 0.8816 +2024-05-21 09:22:07,231 - INFO - Test loss : 0.4558 +2024-05-21 09:22:07,231 - INFO - +Epoch 10 +2024-05-21 09:22:28,952 - INFO - [20480/22599] Loss : 0.8664 +2024-05-21 09:22:30,020 - INFO - Test loss : 0.4825 +2024-05-21 09:22:30,034 - INFO - (1100,) +2024-05-21 09:22:30,077 - INFO - Split ID: 0 +2024-05-21 09:22:30,077 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 1.73 +2024-05-21 09:22:30,077 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 6.09 +2024-05-21 09:22:30,077 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 9.82 +2024-05-21 09:22:30,077 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 16.73 +2024-05-21 09:22:30,078 - INFO - +No prior +2024-05-21 09:22:30,105 - INFO - (24633,) +2024-05-21 09:22:30,761 - INFO - Split ID: 0 +2024-05-21 09:22:30,761 - INFO - Top 1 (Epoch 10)acc (%): 76.08 +2024-05-21 09:22:30,761 - INFO - Top 3 (Epoch 10)acc (%): 90.98 +2024-05-21 09:22:30,761 - INFO - Top 5 (Epoch 10)acc (%): 94.06 +2024-05-21 09:22:30,761 - INFO - Top 10 (Epoch 10)acc (%): 96.83 +2024-05-21 09:23:02,193 - INFO - Split ID: 0 +2024-05-21 09:23:02,193 - INFO - Top 1 (Epoch 10)acc (%): 81.44 +2024-05-21 09:23:02,193 - INFO - Top 3 (Epoch 10)acc (%): 93.51 +2024-05-21 09:23:02,193 - INFO - Top 5 (Epoch 10)acc (%): 95.89 +2024-05-21 09:23:02,194 - INFO - Top 10 (Epoch 10)acc (%): 97.92 +2024-05-21 09:23:02,198 - INFO - +Epoch 11 +2024-05-21 09:23:23,994 - INFO - [20480/22599] Loss : 0.8604 +2024-05-21 09:23:25,048 - INFO - Test loss : 0.4877 +2024-05-21 09:23:25,048 - INFO - +Epoch 12 +2024-05-21 09:23:46,755 - INFO - [20480/22599] Loss : 0.8509 +2024-05-21 09:23:47,822 - INFO - Test loss : 0.4918 +2024-05-21 09:23:47,822 - INFO - +Epoch 13 +2024-05-21 09:24:09,634 - INFO - [20480/22599] Loss : 0.8399 +2024-05-21 09:24:10,662 - INFO - Test loss : 0.5271 +2024-05-21 
09:24:10,662 - INFO - +Epoch 14 +2024-05-21 09:24:32,417 - INFO - [20480/22599] Loss : 0.8297 +2024-05-21 09:24:33,485 - INFO - Test loss : 0.5376 +2024-05-21 09:24:33,485 - INFO - +Epoch 15 +2024-05-21 09:24:55,260 - INFO - [20480/22599] Loss : 0.8243 +2024-05-21 09:24:56,299 - INFO - Test loss : 0.5383 +2024-05-21 09:24:56,299 - INFO - +Epoch 16 +2024-05-21 09:25:18,078 - INFO - [20480/22599] Loss : 0.8188 +2024-05-21 09:25:19,153 - INFO - Test loss : 0.5524 +2024-05-21 09:25:19,153 - INFO - +Epoch 17 +2024-05-21 09:25:40,966 - INFO - [20480/22599] Loss : 0.8115 +2024-05-21 09:25:42,000 - INFO - Test loss : 0.5983 +2024-05-21 09:25:42,000 - INFO - +Epoch 18 +2024-05-21 09:26:03,888 - INFO - [20480/22599] Loss : 0.8063 +2024-05-21 09:26:04,926 - INFO - Test loss : 0.6149 +2024-05-21 09:26:04,926 - INFO - +Epoch 19 +2024-05-21 09:26:26,734 - INFO - [20480/22599] Loss : 0.7993 +2024-05-21 09:26:27,796 - INFO - Test loss : 0.5862 +2024-05-21 09:26:27,796 - INFO - +Epoch 20 +2024-05-21 09:26:49,962 - INFO - [20480/22599] Loss : 0.7960 +2024-05-21 09:26:51,051 - INFO - Test loss : 0.6216 +2024-05-21 09:26:51,059 - INFO - (1100,) +2024-05-21 09:26:51,103 - INFO - Split ID: 0 +2024-05-21 09:26:51,103 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 3.0 +2024-05-21 09:26:51,103 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 8.36 +2024-05-21 09:26:51,103 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 11.55 +2024-05-21 09:26:51,104 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 17.45 +2024-05-21 09:26:51,104 - INFO - +No prior +2024-05-21 09:26:51,130 - INFO - (24633,) +2024-05-21 09:26:51,786 - INFO - Split ID: 0 +2024-05-21 09:26:51,786 - INFO - Top 1 (Epoch 20)acc (%): 76.08 +2024-05-21 09:26:51,786 - INFO - Top 3 (Epoch 20)acc (%): 90.98 +2024-05-21 09:26:51,786 - INFO - Top 5 (Epoch 20)acc (%): 94.06 +2024-05-21 09:26:51,786 - INFO - Top 10 (Epoch 20)acc (%): 96.83 +2024-05-21 09:27:21,953 - INFO - Split ID: 0 +2024-05-21 09:27:21,953 - INFO - Top 1 (Epoch 20)acc (%): 81.74 +2024-05-21 09:27:21,953 - INFO - Top 3 (Epoch 20)acc (%): 93.49 +2024-05-21 09:27:21,953 - INFO - Top 5 (Epoch 20)acc (%): 95.81 +2024-05-21 09:27:21,953 - INFO - Top 10 (Epoch 20)acc (%): 97.82 +2024-05-21 09:27:21,954 - INFO - +Epoch 21 +2024-05-21 09:27:43,661 - INFO - [20480/22599] Loss : 0.7901 +2024-05-21 09:27:44,607 - INFO - Test loss : 0.6409 +2024-05-21 09:27:44,607 - INFO - +Epoch 22 +2024-05-21 09:28:06,018 - INFO - [20480/22599] Loss : 0.7886 +2024-05-21 09:28:07,086 - INFO - Test loss : 0.6386 +2024-05-21 09:28:07,086 - INFO - +Epoch 23 +2024-05-21 09:28:26,476 - INFO - [20480/22599] Loss : 0.7830 +2024-05-21 09:28:27,262 - INFO - Test loss : 0.6371 +2024-05-21 09:28:27,262 - INFO - +Epoch 24 +2024-05-21 09:28:43,263 - INFO - [20480/22599] Loss : 0.7789 +2024-05-21 09:28:44,063 - INFO - Test loss : 0.6756 +2024-05-21 09:28:44,063 - INFO - +Epoch 25 +2024-05-21 09:29:01,903 - INFO - [20480/22599] Loss : 0.7718 +2024-05-21 09:29:02,924 - INFO - Test loss : 0.6670 +2024-05-21 09:29:02,924 - INFO - +Epoch 26 +2024-05-21 09:29:24,519 - INFO - [20480/22599] Loss : 0.7717 +2024-05-21 09:29:25,552 - INFO - Test loss : 0.6729 +2024-05-21 09:29:25,552 - INFO - +Epoch 27 +2024-05-21 09:29:46,957 - INFO - [20480/22599] Loss : 0.7699 +2024-05-21 09:29:48,024 - INFO - Test loss : 0.7193 +2024-05-21 09:29:48,024 - INFO - +Epoch 28 +2024-05-21 09:30:09,703 - INFO - [20480/22599] Loss : 0.7630 +2024-05-21 09:30:10,761 - INFO - Test loss : 0.6942 +2024-05-21 09:30:10,761 - INFO - +Epoch 29 +2024-05-21 09:30:32,501 - INFO - [20480/22599] Loss : 0.7574 
+2024-05-21 09:30:33,568 - INFO - Test loss : 0.7270 +2024-05-21 09:30:33,568 - INFO - Saving output model to ../models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-21 09:30:33,581 - INFO - Saving output model to ../models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-21 09:30:33,608 - INFO - +No prior +2024-05-21 09:30:33,635 - INFO - (24633,) +2024-05-21 09:30:34,294 - INFO - Split ID: 0 +2024-05-21 09:30:34,294 - INFO - Top 1 acc (%): 76.08 +2024-05-21 09:30:34,294 - INFO - Top 3 acc (%): 90.98 +2024-05-21 09:30:34,294 - INFO - Top 5 acc (%): 94.06 +2024-05-21 09:30:34,294 - INFO - Top 10 acc (%): 96.83 +2024-05-21 09:31:04,202 - INFO - Split ID: 0 +2024-05-21 09:31:04,202 - INFO - Top 1 acc (%): 81.7 +2024-05-21 09:31:04,202 - INFO - Top 3 acc (%): 93.43 +2024-05-21 09:31:04,202 - INFO - Top 5 acc (%): 95.76 +2024-05-21 09:31:04,203 - INFO - Top 10 acc (%): 97.7 +2024-05-21 09:31:04,203 - INFO - +Space2Vec-grid +2024-05-21 09:31:04,203 - INFO - Model : model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-21 09:31:04,220 - INFO - (1100,) +2024-05-21 09:31:04,265 - INFO - Split ID: 0 +2024-05-21 09:31:04,265 - INFO - Top 1 LocEnc acc (%): 2.18 +2024-05-21 09:31:04,265 - INFO - Top 3 LocEnc acc (%): 7.73 +2024-05-21 09:31:04,265 - INFO - Top 5 LocEnc acc (%): 10.91 +2024-05-21 09:31:04,266 - INFO - Top 10 LocEnc acc (%): 17.91 +2024-05-31 01:41:57,132 - INFO - +num_classes 555 +2024-05-31 01:41:57,132 - INFO - num train 22599 +2024-05-31 01:41:57,132 - INFO - num val 1100 +2024-05-31 01:41:57,132 - INFO - train loss full_loss +2024-05-31 01:41:57,132 - INFO - model name ../models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:41:57,133 - INFO - num users 5331 +2024-05-31 01:41:57,133 - INFO - meta data ebird_meta +2024-05-31 01:41:57,888 - INFO - +Only Space2Vec-grid +2024-05-31 01:41:57,888 - INFO - Model : model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:41:57,986 - INFO - Saving output model to ../models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:41:58,028 - INFO - Saving output model to ../models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:41:58,071 - INFO - +No prior +2024-05-31 01:41:58,102 - INFO - (24633,) +2024-05-31 01:41:59,645 - INFO - Save results to ../eval_results/eval_nabirds_ebird_meta_test_no_prior.csv +2024-05-31 01:41:59,646 - INFO - Split ID: 0 +2024-05-31 01:41:59,646 - INFO - Top 1 acc (%): 76.08 +2024-05-31 01:41:59,646 - INFO - Top 3 acc (%): 90.98 +2024-05-31 01:41:59,646 - INFO - Top 5 acc (%): 94.06 +2024-05-31 01:41:59,646 - INFO - Top 10 acc (%): 96.83 +2024-05-31 01:42:11,714 - INFO - Split ID: 0 +2024-05-31 01:42:11,714 - INFO - Top 1 hit (%): 81.7 +2024-05-31 01:42:11,714 - INFO - Top 3 hit (%): 93.43 +2024-05-31 01:42:11,714 - INFO - Top 5 hit (%): 95.76 +2024-05-31 01:42:11,714 - INFO - Top 10 hit (%): 97.7 +2024-05-31 01:42:11,720 - INFO - +Only Space2Vec-grid +2024-05-31 01:42:11,720 - INFO - Model : 
model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:42:11,756 - INFO - (1100,) +2024-05-31 01:42:11,800 - INFO - Split ID: 0 +2024-05-31 01:42:11,801 - INFO - Top 1 LocEnc acc (%): 2.18 +2024-05-31 01:42:11,801 - INFO - Top 3 LocEnc acc (%): 7.73 +2024-05-31 01:42:11,801 - INFO - Top 5 LocEnc acc (%): 10.91 +2024-05-31 01:42:11,801 - INFO - Top 10 LocEnc acc (%): 17.91 diff --git a/pre_trained_models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..c97e71d4 Binary files /dev/null and b/pre_trained_models/space2vec_grid/model_nabirds_ebird_meta_Space2Vec-grid_inception_v3_0.0100_32_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..57ebf11c --- /dev/null +++ b/pre_trained_models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.log @@ -0,0 +1,224 @@ +2024-05-21 12:19:56,600 - INFO - +num_classes 100 +2024-05-21 12:19:56,600 - INFO - num train 66739 +2024-05-21 12:19:56,600 - INFO - num val 4449 +2024-05-21 12:19:56,600 - INFO - train loss full_loss +2024-05-21 12:19:56,600 - INFO - model name ../models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-21 12:19:56,600 - INFO - num users 1 +2024-05-21 12:19:57,349 - INFO - +Epoch 0 +2024-05-21 12:20:50,341 - INFO - [65536/66739] Loss : 1.4554 +2024-05-21 12:20:53,959 - INFO - Test loss : 0.6006 +2024-05-21 12:20:53,959 - INFO - +Epoch 1 +2024-05-21 12:21:47,385 - INFO - [65536/66739] Loss : 1.1751 +2024-05-21 12:21:50,867 - INFO - Test loss : 0.4641 +2024-05-21 12:21:50,867 - INFO - +Epoch 2 +2024-05-21 12:22:44,379 - INFO - [65536/66739] Loss : 1.1226 +2024-05-21 12:22:47,876 - INFO - Test loss : 0.4703 +2024-05-21 12:22:47,876 - INFO - +Epoch 3 +2024-05-21 12:23:40,661 - INFO - [65536/66739] Loss : 1.0947 +2024-05-21 12:23:44,493 - INFO - Test loss : 0.4641 +2024-05-21 12:23:44,493 - INFO - +Epoch 4 +2024-05-21 12:24:42,480 - INFO - [65536/66739] Loss : 1.0762 +2024-05-21 12:24:46,574 - INFO - Test loss : 0.4760 +2024-05-21 12:24:46,574 - INFO - +Epoch 5 +2024-05-21 12:25:32,209 - INFO - [65536/66739] Loss : 1.0617 +2024-05-21 12:25:35,303 - INFO - Test loss : 0.4428 +2024-05-21 12:25:35,303 - INFO - +Epoch 6 +2024-05-21 12:26:06,520 - INFO - [65536/66739] Loss : 1.0493 +2024-05-21 12:26:07,802 - INFO - Test loss : 0.4567 +2024-05-21 12:26:07,802 - INFO - +Epoch 7 +2024-05-21 12:26:43,873 - INFO - [65536/66739] Loss : 1.0404 +2024-05-21 12:26:47,035 - INFO - Test loss : 0.4446 +2024-05-21 12:26:47,035 - INFO - +Epoch 8 +2024-05-21 12:27:39,338 - INFO - [65536/66739] Loss : 1.0340 +2024-05-21 12:27:43,123 - INFO - Test loss : 0.4419 +2024-05-21 12:27:43,123 - INFO - +Epoch 9 +2024-05-21 12:28:32,846 - INFO - [65536/66739] Loss : 1.0265 +2024-05-21 12:28:36,421 - INFO - Test loss : 0.4487 +2024-05-21 12:28:36,421 - INFO - +Epoch 10 +2024-05-21 12:29:29,832 
- INFO - [65536/66739] Loss : 1.0197 +2024-05-21 12:29:33,569 - INFO - Test loss : 0.4302 +2024-05-21 12:29:33,606 - INFO - (4449,) +2024-05-21 12:29:33,630 - INFO - Split ID: 0 +2024-05-21 12:29:33,632 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 11.98 +2024-05-21 12:29:33,632 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 25.29 +2024-05-21 12:29:33,632 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 33.83 +2024-05-21 12:29:33,632 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 48.53 +2024-05-21 12:29:33,633 - INFO - +No prior +2024-05-21 12:29:33,635 - INFO - (17798,) +2024-05-21 12:29:33,728 - INFO - Split ID: 0 +2024-05-21 12:29:33,729 - INFO - Top 1 (Epoch 10)acc (%): 50.15 +2024-05-21 12:29:33,729 - INFO - Top 3 (Epoch 10)acc (%): 73.9 +2024-05-21 12:29:33,729 - INFO - Top 5 (Epoch 10)acc (%): 82.45 +2024-05-21 12:29:33,729 - INFO - Top 10 (Epoch 10)acc (%): 91.06 +2024-05-21 12:29:52,838 - INFO - Split ID: 0 +2024-05-21 12:29:52,838 - INFO - Top 1 (Epoch 10)acc (%): 51.05 +2024-05-21 12:29:52,838 - INFO - Top 3 (Epoch 10)acc (%): 75.28 +2024-05-21 12:29:52,838 - INFO - Top 5 (Epoch 10)acc (%): 83.87 +2024-05-21 12:29:52,838 - INFO - Top 10 (Epoch 10)acc (%): 92.15 +2024-05-21 12:29:52,838 - INFO - +Epoch 11 +2024-05-21 12:30:46,197 - INFO - [65536/66739] Loss : 1.0141 +2024-05-21 12:30:48,979 - INFO - Test loss : 0.4427 +2024-05-21 12:30:48,980 - INFO - +Epoch 12 +2024-05-21 12:31:47,937 - INFO - [65536/66739] Loss : 1.0097 +2024-05-21 12:31:50,561 - INFO - Test loss : 0.4525 +2024-05-21 12:31:50,561 - INFO - +Epoch 13 +2024-05-21 12:32:44,352 - INFO - [65536/66739] Loss : 1.0022 +2024-05-21 12:32:47,872 - INFO - Test loss : 0.4452 +2024-05-21 12:32:47,872 - INFO - +Epoch 14 +2024-05-21 12:33:41,261 - INFO - [65536/66739] Loss : 1.0005 +2024-05-21 12:33:44,660 - INFO - Test loss : 0.4502 +2024-05-21 12:33:44,660 - INFO - +Epoch 15 +2024-05-21 12:34:38,207 - INFO - [65536/66739] Loss : 0.9954 +2024-05-21 12:34:41,348 - INFO - Test loss : 0.4362 +2024-05-21 12:34:41,348 - INFO - +Epoch 16 +2024-05-21 12:35:35,024 - INFO - [65536/66739] Loss : 0.9929 +2024-05-21 12:35:38,590 - INFO - Test loss : 0.4362 +2024-05-21 12:35:38,590 - INFO - +Epoch 17 +2024-05-21 12:36:32,393 - INFO - [65536/66739] Loss : 0.9889 +2024-05-21 12:36:35,912 - INFO - Test loss : 0.4416 +2024-05-21 12:36:35,913 - INFO - +Epoch 18 +2024-05-21 12:37:28,812 - INFO - [65536/66739] Loss : 0.9867 +2024-05-21 12:37:32,365 - INFO - Test loss : 0.4643 +2024-05-21 12:37:32,365 - INFO - +Epoch 19 +2024-05-21 12:38:23,340 - INFO - [65536/66739] Loss : 0.9824 +2024-05-21 12:38:25,837 - INFO - Test loss : 0.4492 +2024-05-21 12:38:25,837 - INFO - +Epoch 20 +2024-05-21 12:39:17,721 - INFO - [65536/66739] Loss : 0.9799 +2024-05-21 12:39:21,222 - INFO - Test loss : 0.4576 +2024-05-21 12:39:21,256 - INFO - (4449,) +2024-05-21 12:39:21,278 - INFO - Split ID: 0 +2024-05-21 12:39:21,280 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 12.5 +2024-05-21 12:39:21,280 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 27.02 +2024-05-21 12:39:21,280 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 36.28 +2024-05-21 12:39:21,280 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 50.98 +2024-05-21 12:39:21,281 - INFO - +No prior +2024-05-21 12:39:21,283 - INFO - (17798,) +2024-05-21 12:39:21,377 - INFO - Split ID: 0 +2024-05-21 12:39:21,378 - INFO - Top 1 (Epoch 20)acc (%): 50.15 +2024-05-21 12:39:21,378 - INFO - Top 3 (Epoch 20)acc (%): 73.9 +2024-05-21 12:39:21,378 - INFO - Top 5 (Epoch 20)acc (%): 82.45 +2024-05-21 12:39:21,378 - INFO - Top 10 (Epoch 20)acc (%): 91.06 +2024-05-21 12:39:40,370 - 
INFO - Split ID: 0 +2024-05-21 12:39:40,370 - INFO - Top 1 (Epoch 20)acc (%): 51.25 +2024-05-21 12:39:40,370 - INFO - Top 3 (Epoch 20)acc (%): 75.37 +2024-05-21 12:39:40,370 - INFO - Top 5 (Epoch 20)acc (%): 83.89 +2024-05-21 12:39:40,371 - INFO - Top 10 (Epoch 20)acc (%): 92.29 +2024-05-21 12:39:40,371 - INFO - +Epoch 21 +2024-05-21 12:40:32,054 - INFO - [65536/66739] Loss : 0.9758 +2024-05-21 12:40:35,602 - INFO - Test loss : 0.4538 +2024-05-21 12:40:35,602 - INFO - +Epoch 22 +2024-05-21 12:41:28,363 - INFO - [65536/66739] Loss : 0.9766 +2024-05-21 12:41:32,027 - INFO - Test loss : 0.4845 +2024-05-21 12:41:32,027 - INFO - +Epoch 23 +2024-05-21 12:42:24,476 - INFO - [65536/66739] Loss : 0.9733 +2024-05-21 12:42:28,021 - INFO - Test loss : 0.4577 +2024-05-21 12:42:28,021 - INFO - +Epoch 24 +2024-05-21 12:43:21,489 - INFO - [65536/66739] Loss : 0.9680 +2024-05-21 12:43:25,006 - INFO - Test loss : 0.4775 +2024-05-21 12:43:25,006 - INFO - +Epoch 25 +2024-05-21 12:44:18,239 - INFO - [65536/66739] Loss : 0.9656 +2024-05-21 12:44:21,756 - INFO - Test loss : 0.4841 +2024-05-21 12:44:21,757 - INFO - +Epoch 26 +2024-05-21 12:45:14,653 - INFO - [65536/66739] Loss : 0.9651 +2024-05-21 12:45:18,163 - INFO - Test loss : 0.4982 +2024-05-21 12:45:18,163 - INFO - +Epoch 27 +2024-05-21 12:46:11,290 - INFO - [65536/66739] Loss : 0.9611 +2024-05-21 12:46:14,479 - INFO - Test loss : 0.4931 +2024-05-21 12:46:14,479 - INFO - +Epoch 28 +2024-05-21 12:47:08,225 - INFO - [65536/66739] Loss : 0.9601 +2024-05-21 12:47:11,466 - INFO - Test loss : 0.5132 +2024-05-21 12:47:11,466 - INFO - +Epoch 29 +2024-05-21 12:48:02,628 - INFO - [65536/66739] Loss : 0.9574 +2024-05-21 12:48:06,194 - INFO - Test loss : 0.5021 +2024-05-21 12:48:06,195 - INFO - Saving output model to ../models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-21 12:48:06,203 - INFO - Saving output model to ../models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-21 12:48:06,299 - INFO - +No prior +2024-05-21 12:48:06,301 - INFO - (17798,) +2024-05-21 12:48:06,395 - INFO - Split ID: 0 +2024-05-21 12:48:06,395 - INFO - Top 1 acc (%): 50.15 +2024-05-21 12:48:06,396 - INFO - Top 3 acc (%): 73.9 +2024-05-21 12:48:06,396 - INFO - Top 5 acc (%): 82.45 +2024-05-21 12:48:06,396 - INFO - Top 10 acc (%): 91.06 +2024-05-21 12:48:25,080 - INFO - Split ID: 0 +2024-05-21 12:48:25,080 - INFO - Top 1 acc (%): 51.25 +2024-05-21 12:48:25,080 - INFO - Top 3 acc (%): 75.5 +2024-05-21 12:48:25,080 - INFO - Top 5 acc (%): 83.92 +2024-05-21 12:48:25,080 - INFO - Top 10 acc (%): 92.31 +2024-05-21 12:48:25,080 - INFO - +Space2Vec-grid +2024-05-21 12:48:25,080 - INFO - Model : model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-21 12:48:25,123 - INFO - (4449,) +2024-05-21 12:48:25,148 - INFO - Split ID: 0 +2024-05-21 12:48:25,150 - INFO - Top 1 LocEnc acc (%): 12.65 +2024-05-21 12:48:25,150 - INFO - Top 3 LocEnc acc (%): 27.71 +2024-05-21 12:48:25,150 - INFO - Top 5 LocEnc acc (%): 36.73 +2024-05-21 12:48:25,150 - INFO - Top 10 LocEnc acc (%): 51.47 +2024-05-31 01:40:47,497 - INFO - +num_classes 100 +2024-05-31 01:40:47,497 - INFO - num train 66739 +2024-05-31 01:40:47,497 - INFO - num val 4449 +2024-05-31 01:40:47,497 - INFO - train loss full_loss +2024-05-31 01:40:47,497 - INFO - model name 
../models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:40:47,497 - INFO - num users 1 +2024-05-31 01:40:48,238 - INFO - +Only Space2Vec-grid +2024-05-31 01:40:48,239 - INFO - Model : model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:40:48,314 - INFO - Saving output model to ../models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:40:48,336 - INFO - Saving output model to ../models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:40:48,350 - INFO - +No prior +2024-05-31 01:40:48,355 - INFO - (17798,) +2024-05-31 01:40:48,752 - INFO - Save results to ../eval_results/eval_yfcc__test_no_prior.csv +2024-05-31 01:40:48,752 - INFO - Split ID: 0 +2024-05-31 01:40:48,752 - INFO - Top 1 acc (%): 50.15 +2024-05-31 01:40:48,752 - INFO - Top 3 acc (%): 73.9 +2024-05-31 01:40:48,752 - INFO - Top 5 acc (%): 82.45 +2024-05-31 01:40:48,753 - INFO - Top 10 acc (%): 91.06 +2024-05-31 01:41:00,448 - INFO - Split ID: 0 +2024-05-31 01:41:00,448 - INFO - Top 1 hit (%): 51.25 +2024-05-31 01:41:00,448 - INFO - Top 3 hit (%): 75.5 +2024-05-31 01:41:00,448 - INFO - Top 5 hit (%): 83.92 +2024-05-31 01:41:00,448 - INFO - Top 10 hit (%): 92.31 +2024-05-31 01:41:00,452 - INFO - +Only Space2Vec-grid +2024-05-31 01:41:00,452 - INFO - Model : model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:41:00,513 - INFO - (4449,) +2024-05-31 01:41:00,537 - INFO - Split ID: 0 +2024-05-31 01:41:00,539 - INFO - Top 1 LocEnc acc (%): 12.65 +2024-05-31 01:41:00,539 - INFO - Top 3 LocEnc acc (%): 27.71 +2024-05-31 01:41:00,539 - INFO - Top 5 LocEnc acc (%): 36.73 +2024-05-31 01:41:00,540 - INFO - Top 10 LocEnc acc (%): 51.47 diff --git a/pre_trained_models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..90b860c3 Binary files /dev/null and b/pre_trained_models/space2vec_grid/model_yfcc_Space2Vec-grid_inception_v3_0.0100_64_0.0050000_360.000_1_512_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..946b86bb --- /dev/null +++ b/pre_trained_models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log @@ -0,0 +1,226 @@ +2024-05-23 15:29:21,430 - INFO - +num_classes 500 +2024-05-23 15:29:21,430 - INFO - num train 42490 +2024-05-23 15:29:21,430 - INFO - num val 980 +2024-05-23 15:29:21,430 - INFO - train loss full_loss +2024-05-23 15:29:21,430 - INFO - model name ../models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 15:29:21,430 - INFO - num users 5763 +2024-05-23 15:29:21,430 - INFO - meta data ebird_meta +2024-05-23 
15:29:22,285 - INFO - +Epoch 0 +2024-05-23 15:30:04,802 - INFO - [40960/42490] Loss : 1.5200 +2024-05-23 15:30:05,766 - INFO - Test loss : 0.6723 +2024-05-23 15:30:05,766 - INFO - +Epoch 1 +2024-05-23 15:30:47,683 - INFO - [40960/42490] Loss : 1.0729 +2024-05-23 15:30:48,647 - INFO - Test loss : 0.2771 +2024-05-23 15:30:48,648 - INFO - +Epoch 2 +2024-05-23 15:31:30,593 - INFO - [40960/42490] Loss : 0.9481 +2024-05-23 15:31:31,538 - INFO - Test loss : 0.2386 +2024-05-23 15:31:31,538 - INFO - +Epoch 3 +2024-05-23 15:32:13,439 - INFO - [40960/42490] Loss : 0.8939 +2024-05-23 15:32:14,396 - INFO - Test loss : 0.2383 +2024-05-23 15:32:14,396 - INFO - +Epoch 4 +2024-05-23 15:32:56,351 - INFO - [40960/42490] Loss : 0.8612 +2024-05-23 15:32:57,314 - INFO - Test loss : 0.2500 +2024-05-23 15:32:57,314 - INFO - +Epoch 5 +2024-05-23 15:33:39,200 - INFO - [40960/42490] Loss : 0.8396 +2024-05-23 15:33:40,163 - INFO - Test loss : 0.2586 +2024-05-23 15:33:40,163 - INFO - +Epoch 6 +2024-05-23 15:34:22,026 - INFO - [40960/42490] Loss : 0.8219 +2024-05-23 15:34:23,005 - INFO - Test loss : 0.2620 +2024-05-23 15:34:23,005 - INFO - +Epoch 7 +2024-05-23 15:35:05,156 - INFO - [40960/42490] Loss : 0.8068 +2024-05-23 15:35:06,119 - INFO - Test loss : 0.2656 +2024-05-23 15:35:06,119 - INFO - +Epoch 8 +2024-05-23 15:35:48,040 - INFO - [40960/42490] Loss : 0.7973 +2024-05-23 15:35:49,002 - INFO - Test loss : 0.2655 +2024-05-23 15:35:49,003 - INFO - +Epoch 9 +2024-05-23 15:36:30,905 - INFO - [40960/42490] Loss : 0.7858 +2024-05-23 15:36:31,852 - INFO - Test loss : 0.2658 +2024-05-23 15:36:31,852 - INFO - +Epoch 10 +2024-05-23 15:37:13,712 - INFO - [40960/42490] Loss : 0.7800 +2024-05-23 15:37:14,675 - INFO - Test loss : 0.2795 +2024-05-23 15:37:14,688 - INFO - (980,) +2024-05-23 15:37:14,722 - INFO - Split ID: 0 +2024-05-23 15:37:14,723 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 5.41 +2024-05-23 15:37:14,723 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 12.14 +2024-05-23 15:37:14,723 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 16.94 +2024-05-23 15:37:14,723 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 28.06 +2024-05-23 15:37:14,723 - INFO - +No prior +2024-05-23 15:37:14,724 - INFO - (2262,) +2024-05-23 15:37:14,777 - INFO - Split ID: 0 +2024-05-23 15:37:14,777 - INFO - Top 1 (Epoch 10)acc (%): 70.07 +2024-05-23 15:37:14,777 - INFO - Top 3 (Epoch 10)acc (%): 86.6 +2024-05-23 15:37:14,777 - INFO - Top 5 (Epoch 10)acc (%): 90.05 +2024-05-23 15:37:14,777 - INFO - Top 10 (Epoch 10)acc (%): 92.88 +2024-05-23 15:37:17,719 - INFO - Split ID: 0 +2024-05-23 15:37:17,719 - INFO - Top 1 (Epoch 10)acc (%): 78.82 +2024-05-23 15:37:17,719 - INFO - Top 3 (Epoch 10)acc (%): 90.94 +2024-05-23 15:37:17,719 - INFO - Top 5 (Epoch 10)acc (%): 93.32 +2024-05-23 15:37:17,719 - INFO - Top 10 (Epoch 10)acc (%): 95.76 +2024-05-23 15:37:17,719 - INFO - +Epoch 11 +2024-05-23 15:37:59,636 - INFO - [40960/42490] Loss : 0.7721 +2024-05-23 15:38:00,641 - INFO - Test loss : 0.2863 +2024-05-23 15:38:00,641 - INFO - +Epoch 12 +2024-05-23 15:38:44,382 - INFO - [40960/42490] Loss : 0.7666 +2024-05-23 15:38:45,384 - INFO - Test loss : 0.2958 +2024-05-23 15:38:45,384 - INFO - +Epoch 13 +2024-05-23 15:39:27,486 - INFO - [40960/42490] Loss : 0.7574 +2024-05-23 15:39:28,449 - INFO - Test loss : 0.3029 +2024-05-23 15:39:28,449 - INFO - +Epoch 14 +2024-05-23 15:40:10,295 - INFO - [40960/42490] Loss : 0.7540 +2024-05-23 15:40:11,238 - INFO - Test loss : 0.3021 +2024-05-23 15:40:11,238 - INFO - +Epoch 15 +2024-05-23 15:40:53,075 - INFO - [40960/42490] Loss : 0.7476 +2024-05-23 
15:40:54,037 - INFO - Test loss : 0.2923 +2024-05-23 15:40:54,038 - INFO - +Epoch 16 +2024-05-23 15:41:35,954 - INFO - [40960/42490] Loss : 0.7437 +2024-05-23 15:41:36,916 - INFO - Test loss : 0.3095 +2024-05-23 15:41:36,916 - INFO - +Epoch 17 +2024-05-23 15:42:18,814 - INFO - [40960/42490] Loss : 0.7400 +2024-05-23 15:42:19,774 - INFO - Test loss : 0.3151 +2024-05-23 15:42:19,774 - INFO - +Epoch 18 +2024-05-23 15:43:01,659 - INFO - [40960/42490] Loss : 0.7368 +2024-05-23 15:43:02,622 - INFO - Test loss : 0.3135 +2024-05-23 15:43:02,622 - INFO - +Epoch 19 +2024-05-23 15:43:44,527 - INFO - [40960/42490] Loss : 0.7318 +2024-05-23 15:43:45,491 - INFO - Test loss : 0.3276 +2024-05-23 15:43:45,491 - INFO - +Epoch 20 +2024-05-23 15:44:27,388 - INFO - [40960/42490] Loss : 0.7292 +2024-05-23 15:44:28,352 - INFO - Test loss : 0.3260 +2024-05-23 15:44:28,363 - INFO - (980,) +2024-05-23 15:44:28,401 - INFO - Split ID: 0 +2024-05-23 15:44:28,402 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 6.33 +2024-05-23 15:44:28,402 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 12.96 +2024-05-23 15:44:28,402 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 19.18 +2024-05-23 15:44:28,402 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 29.59 +2024-05-23 15:44:28,402 - INFO - +No prior +2024-05-23 15:44:28,403 - INFO - (2262,) +2024-05-23 15:44:28,455 - INFO - Split ID: 0 +2024-05-23 15:44:28,455 - INFO - Top 1 (Epoch 20)acc (%): 70.07 +2024-05-23 15:44:28,455 - INFO - Top 3 (Epoch 20)acc (%): 86.6 +2024-05-23 15:44:28,455 - INFO - Top 5 (Epoch 20)acc (%): 90.05 +2024-05-23 15:44:28,455 - INFO - Top 10 (Epoch 20)acc (%): 92.88 +2024-05-23 15:44:31,394 - INFO - Split ID: 0 +2024-05-23 15:44:31,394 - INFO - Top 1 (Epoch 20)acc (%): 79.62 +2024-05-23 15:44:31,394 - INFO - Top 3 (Epoch 20)acc (%): 91.29 +2024-05-23 15:44:31,394 - INFO - Top 5 (Epoch 20)acc (%): 93.77 +2024-05-23 15:44:31,394 - INFO - Top 10 (Epoch 20)acc (%): 95.84 +2024-05-23 15:44:31,394 - INFO - +Epoch 21 +2024-05-23 15:45:13,346 - INFO - [40960/42490] Loss : 0.7227 +2024-05-23 15:45:14,310 - INFO - Test loss : 0.3283 +2024-05-23 15:45:14,310 - INFO - +Epoch 22 +2024-05-23 15:45:56,208 - INFO - [40960/42490] Loss : 0.7208 +2024-05-23 15:45:57,172 - INFO - Test loss : 0.3381 +2024-05-23 15:45:57,172 - INFO - +Epoch 23 +2024-05-23 15:46:39,173 - INFO - [40960/42490] Loss : 0.7175 +2024-05-23 15:46:40,153 - INFO - Test loss : 0.3397 +2024-05-23 15:46:40,153 - INFO - +Epoch 24 +2024-05-23 15:47:15,999 - INFO - [40960/42490] Loss : 0.7150 +2024-05-23 15:47:16,986 - INFO - Test loss : 0.3515 +2024-05-23 15:47:16,986 - INFO - +Epoch 25 +2024-05-23 15:47:59,942 - INFO - [40960/42490] Loss : 0.7103 +2024-05-23 15:48:00,929 - INFO - Test loss : 0.3549 +2024-05-23 15:48:00,929 - INFO - +Epoch 26 +2024-05-23 15:48:44,376 - INFO - [40960/42490] Loss : 0.7131 +2024-05-23 15:48:45,065 - INFO - Test loss : 0.3558 +2024-05-23 15:48:45,066 - INFO - +Epoch 27 +2024-05-23 15:49:17,405 - INFO - [40960/42490] Loss : 0.7053 +2024-05-23 15:49:18,143 - INFO - Test loss : 0.3659 +2024-05-23 15:49:18,143 - INFO - +Epoch 28 +2024-05-23 15:49:49,655 - INFO - [40960/42490] Loss : 0.7039 +2024-05-23 15:49:50,376 - INFO - Test loss : 0.3613 +2024-05-23 15:49:50,376 - INFO - +Epoch 29 +2024-05-23 15:50:31,779 - INFO - [40960/42490] Loss : 0.7025 +2024-05-23 15:50:32,703 - INFO - Test loss : 0.3662 +2024-05-23 15:50:32,704 - INFO - Saving output model to ../models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 
15:50:32,720 - INFO - Saving output model to ../models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 15:50:32,757 - INFO - +No prior +2024-05-23 15:50:32,759 - INFO - (2262,) +2024-05-23 15:50:32,811 - INFO - Split ID: 0 +2024-05-23 15:50:32,811 - INFO - Top 1 acc (%): 70.07 +2024-05-23 15:50:32,812 - INFO - Top 3 acc (%): 86.6 +2024-05-23 15:50:32,812 - INFO - Top 5 acc (%): 90.05 +2024-05-23 15:50:32,812 - INFO - Top 10 acc (%): 92.88 +2024-05-23 15:50:35,753 - INFO - Split ID: 0 +2024-05-23 15:50:35,753 - INFO - Top 1 acc (%): 80.11 +2024-05-23 15:50:35,754 - INFO - Top 3 acc (%): 91.29 +2024-05-23 15:50:35,754 - INFO - Top 5 acc (%): 93.77 +2024-05-23 15:50:35,754 - INFO - Top 10 acc (%): 96.02 +2024-05-23 15:50:35,754 - INFO - +Space2Vec-theory +2024-05-23 15:50:35,754 - INFO - Model : model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 15:50:35,770 - INFO - (980,) +2024-05-23 15:50:35,804 - INFO - Split ID: 0 +2024-05-23 15:50:35,805 - INFO - Top 1 LocEnc acc (%): 6.53 +2024-05-23 15:50:35,805 - INFO - Top 3 LocEnc acc (%): 14.08 +2024-05-23 15:50:35,805 - INFO - Top 5 LocEnc acc (%): 19.49 +2024-05-23 15:50:35,805 - INFO - Top 10 LocEnc acc (%): 30.31 +2024-05-31 01:49:04,679 - INFO - +num_classes 500 +2024-05-31 01:49:04,680 - INFO - num train 42490 +2024-05-31 01:49:04,680 - INFO - num val 980 +2024-05-31 01:49:04,680 - INFO - train loss full_loss +2024-05-31 01:49:04,680 - INFO - model name ../models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:49:04,680 - INFO - num users 5763 +2024-05-31 01:49:04,680 - INFO - meta data ebird_meta +2024-05-31 01:49:05,500 - INFO - +Only Space2Vec-theory +2024-05-31 01:49:05,500 - INFO - Model : model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:49:05,596 - INFO - Saving output model to ../models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:49:05,642 - INFO - Saving output model to ../models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:49:05,690 - INFO - +No prior +2024-05-31 01:49:05,693 - INFO - (2262,) +2024-05-31 01:49:05,820 - INFO - Save results to ../eval_results/eval_birdsnap_ebird_meta_test_no_prior.csv +2024-05-31 01:49:05,820 - INFO - Split ID: 0 +2024-05-31 01:49:05,821 - INFO - Top 1 acc (%): 70.07 +2024-05-31 01:49:05,821 - INFO - Top 3 acc (%): 86.6 +2024-05-31 01:49:05,821 - INFO - Top 5 acc (%): 90.05 +2024-05-31 01:49:05,821 - INFO - Top 10 acc (%): 92.88 +2024-05-31 01:49:07,808 - INFO - Split ID: 0 +2024-05-31 01:49:07,808 - INFO - Top 1 hit (%): 80.11 +2024-05-31 01:49:07,808 - INFO - Top 3 hit (%): 91.29 +2024-05-31 01:49:07,808 - INFO - Top 5 hit (%): 93.77 +2024-05-31 01:49:07,808 - INFO - Top 10 hit (%): 96.02 +2024-05-31 01:49:07,809 - INFO - +Only Space2Vec-theory +2024-05-31 01:49:07,809 - INFO - Model : model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:49:07,850 - INFO - (980,) +2024-05-31 01:49:07,887 - INFO - Split ID: 0 +2024-05-31 01:49:07,887 - INFO - Top 1 LocEnc 
acc (%): 6.53 +2024-05-31 01:49:07,887 - INFO - Top 3 LocEnc acc (%): 14.08 +2024-05-31 01:49:07,887 - INFO - Top 5 LocEnc acc (%): 19.49 +2024-05-31 01:49:07,888 - INFO - Top 10 LocEnc acc (%): 30.31 diff --git a/pre_trained_models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..4d36f8b9 Binary files /dev/null and b/pre_trained_models/space2vec_theory/model_birdsnap_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..b7665dd4 --- /dev/null +++ b/pre_trained_models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.log @@ -0,0 +1,226 @@ +2024-05-23 16:56:49,197 - INFO - +num_classes 500 +2024-05-23 16:56:49,197 - INFO - num train 19133 +2024-05-23 16:56:49,197 - INFO - num val 443 +2024-05-23 16:56:49,197 - INFO - train loss full_loss +2024-05-23 16:56:49,198 - INFO - model name ../models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 16:56:49,198 - INFO - num users 2872 +2024-05-23 16:56:49,198 - INFO - meta data orig_meta +2024-05-23 16:56:50,088 - INFO - +Epoch 0 +2024-05-23 16:57:08,282 - INFO - [16384/19133] Loss : 1.7678 +2024-05-23 16:57:08,567 - INFO - Test loss : 0.4243 +2024-05-23 16:57:08,567 - INFO - +Epoch 1 +2024-05-23 16:57:24,287 - INFO - [16384/19133] Loss : 1.3865 +2024-05-23 16:57:24,311 - INFO - Test loss : 0.6888 +2024-05-23 16:57:24,311 - INFO - +Epoch 2 +2024-05-23 16:57:39,584 - INFO - [16384/19133] Loss : 1.2173 +2024-05-23 16:57:40,117 - INFO - Test loss : 0.4907 +2024-05-23 16:57:40,117 - INFO - +Epoch 3 +2024-05-23 16:57:58,455 - INFO - [16384/19133] Loss : 1.1123 +2024-05-23 16:57:58,745 - INFO - Test loss : 0.3817 +2024-05-23 16:57:58,745 - INFO - +Epoch 4 +2024-05-23 16:58:14,443 - INFO - [16384/19133] Loss : 1.0532 +2024-05-23 16:58:14,727 - INFO - Test loss : 0.4508 +2024-05-23 16:58:14,727 - INFO - +Epoch 5 +2024-05-23 16:58:30,194 - INFO - [16384/19133] Loss : 0.9950 +2024-05-23 16:58:30,694 - INFO - Test loss : 0.4566 +2024-05-23 16:58:30,694 - INFO - +Epoch 6 +2024-05-23 16:58:47,952 - INFO - [16384/19133] Loss : 0.9545 +2024-05-23 16:58:48,489 - INFO - Test loss : 0.4298 +2024-05-23 16:58:48,489 - INFO - +Epoch 7 +2024-05-23 16:59:05,759 - INFO - [16384/19133] Loss : 0.9222 +2024-05-23 16:59:06,297 - INFO - Test loss : 0.4749 +2024-05-23 16:59:06,297 - INFO - +Epoch 8 +2024-05-23 16:59:22,595 - INFO - [16384/19133] Loss : 0.8880 +2024-05-23 16:59:22,887 - INFO - Test loss : 0.4606 +2024-05-23 16:59:22,887 - INFO - +Epoch 9 +2024-05-23 16:59:40,345 - INFO - [16384/19133] Loss : 0.8690 +2024-05-23 16:59:40,604 - INFO - Test loss : 0.4722 +2024-05-23 16:59:40,605 - INFO - +Epoch 10 +2024-05-23 16:59:57,815 - INFO - [16384/19133] Loss : 0.8407 +2024-05-23 16:59:58,148 - INFO - Test loss : 0.4865 +2024-05-23 
16:59:58,158 - INFO - (443,) +2024-05-23 16:59:58,174 - INFO - Split ID: 0 +2024-05-23 16:59:58,175 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 6.32 +2024-05-23 16:59:58,175 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 11.51 +2024-05-23 16:59:58,175 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 15.8 +2024-05-23 16:59:58,175 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 24.83 +2024-05-23 16:59:58,175 - INFO - +No prior +2024-05-23 16:59:58,176 - INFO - (2262,) +2024-05-23 16:59:58,229 - INFO - Split ID: 0 +2024-05-23 16:59:58,229 - INFO - Top 1 (Epoch 10)acc (%): 70.07 +2024-05-23 16:59:58,229 - INFO - Top 3 (Epoch 10)acc (%): 86.6 +2024-05-23 16:59:58,229 - INFO - Top 5 (Epoch 10)acc (%): 90.05 +2024-05-23 16:59:58,229 - INFO - Top 10 (Epoch 10)acc (%): 92.88 +2024-05-23 16:59:59,602 - INFO - Split ID: 0 +2024-05-23 16:59:59,602 - INFO - Top 1 (Epoch 10)acc (%): 71.57 +2024-05-23 16:59:59,602 - INFO - Top 3 (Epoch 10)acc (%): 86.47 +2024-05-23 16:59:59,602 - INFO - Top 5 (Epoch 10)acc (%): 90.01 +2024-05-23 16:59:59,602 - INFO - Top 10 (Epoch 10)acc (%): 93.32 +2024-05-23 16:59:59,602 - INFO - +Epoch 11 +2024-05-23 17:00:17,452 - INFO - [16384/19133] Loss : 0.8268 +2024-05-23 17:00:17,905 - INFO - Test loss : 0.4813 +2024-05-23 17:00:17,905 - INFO - +Epoch 12 +2024-05-23 17:00:35,150 - INFO - [16384/19133] Loss : 0.8086 +2024-05-23 17:00:35,477 - INFO - Test loss : 0.5216 +2024-05-23 17:00:35,478 - INFO - +Epoch 13 +2024-05-23 17:00:52,716 - INFO - [16384/19133] Loss : 0.7928 +2024-05-23 17:00:53,043 - INFO - Test loss : 0.5154 +2024-05-23 17:00:53,043 - INFO - +Epoch 14 +2024-05-23 17:01:10,371 - INFO - [16384/19133] Loss : 0.7796 +2024-05-23 17:01:10,659 - INFO - Test loss : 0.5331 +2024-05-23 17:01:10,659 - INFO - +Epoch 15 +2024-05-23 17:01:27,409 - INFO - [16384/19133] Loss : 0.7589 +2024-05-23 17:01:27,717 - INFO - Test loss : 0.5682 +2024-05-23 17:01:27,718 - INFO - +Epoch 16 +2024-05-23 17:01:44,941 - INFO - [16384/19133] Loss : 0.7533 +2024-05-23 17:01:45,227 - INFO - Test loss : 0.5605 +2024-05-23 17:01:45,227 - INFO - +Epoch 17 +2024-05-23 17:02:02,609 - INFO - [16384/19133] Loss : 0.7450 +2024-05-23 17:02:03,150 - INFO - Test loss : 0.5665 +2024-05-23 17:02:03,150 - INFO - +Epoch 18 +2024-05-23 17:02:20,193 - INFO - [16384/19133] Loss : 0.7290 +2024-05-23 17:02:20,733 - INFO - Test loss : 0.5826 +2024-05-23 17:02:20,733 - INFO - +Epoch 19 +2024-05-23 17:02:38,048 - INFO - [16384/19133] Loss : 0.7211 +2024-05-23 17:02:38,374 - INFO - Test loss : 0.5841 +2024-05-23 17:02:38,374 - INFO - +Epoch 20 +2024-05-23 17:02:55,919 - INFO - [16384/19133] Loss : 0.7144 +2024-05-23 17:02:56,245 - INFO - Test loss : 0.5960 +2024-05-23 17:02:56,251 - INFO - (443,) +2024-05-23 17:02:56,267 - INFO - Split ID: 0 +2024-05-23 17:02:56,268 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 5.42 +2024-05-23 17:02:56,268 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 13.77 +2024-05-23 17:02:56,268 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 19.19 +2024-05-23 17:02:56,268 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 29.12 +2024-05-23 17:02:56,268 - INFO - +No prior +2024-05-23 17:02:56,269 - INFO - (2262,) +2024-05-23 17:02:56,320 - INFO - Split ID: 0 +2024-05-23 17:02:56,321 - INFO - Top 1 (Epoch 20)acc (%): 70.07 +2024-05-23 17:02:56,321 - INFO - Top 3 (Epoch 20)acc (%): 86.6 +2024-05-23 17:02:56,321 - INFO - Top 5 (Epoch 20)acc (%): 90.05 +2024-05-23 17:02:56,321 - INFO - Top 10 (Epoch 20)acc (%): 92.88 +2024-05-23 17:02:57,866 - INFO - Split ID: 0 +2024-05-23 17:02:57,866 - INFO - Top 1 (Epoch 20)acc (%): 71.93 +2024-05-23 17:02:57,866 
- INFO - Top 3 (Epoch 20)acc (%): 86.52 +2024-05-23 17:02:57,866 - INFO - Top 5 (Epoch 20)acc (%): 89.92 +2024-05-23 17:02:57,866 - INFO - Top 10 (Epoch 20)acc (%): 93.15 +2024-05-23 17:02:57,866 - INFO - +Epoch 21 +2024-05-23 17:03:15,196 - INFO - [16384/19133] Loss : 0.7087 +2024-05-23 17:03:15,735 - INFO - Test loss : 0.6357 +2024-05-23 17:03:15,736 - INFO - +Epoch 22 +2024-05-23 17:03:32,737 - INFO - [16384/19133] Loss : 0.7005 +2024-05-23 17:03:33,005 - INFO - Test loss : 0.6378 +2024-05-23 17:03:33,005 - INFO - +Epoch 23 +2024-05-23 17:03:50,615 - INFO - [16384/19133] Loss : 0.6916 +2024-05-23 17:03:51,158 - INFO - Test loss : 0.6639 +2024-05-23 17:03:51,159 - INFO - +Epoch 24 +2024-05-23 17:04:05,646 - INFO - [16384/19133] Loss : 0.6876 +2024-05-23 17:04:06,078 - INFO - Test loss : 0.6553 +2024-05-23 17:04:06,078 - INFO - +Epoch 25 +2024-05-23 17:04:20,530 - INFO - [16384/19133] Loss : 0.6765 +2024-05-23 17:04:20,856 - INFO - Test loss : 0.6747 +2024-05-23 17:04:20,857 - INFO - +Epoch 26 +2024-05-23 17:04:37,318 - INFO - [16384/19133] Loss : 0.6700 +2024-05-23 17:04:37,837 - INFO - Test loss : 0.7051 +2024-05-23 17:04:37,838 - INFO - +Epoch 27 +2024-05-23 17:04:55,926 - INFO - [16384/19133] Loss : 0.6653 +2024-05-23 17:04:56,358 - INFO - Test loss : 0.6967 +2024-05-23 17:04:56,358 - INFO - +Epoch 28 +2024-05-23 17:05:13,333 - INFO - [16384/19133] Loss : 0.6603 +2024-05-23 17:05:13,841 - INFO - Test loss : 0.7098 +2024-05-23 17:05:13,841 - INFO - +Epoch 29 +2024-05-23 17:05:31,283 - INFO - [16384/19133] Loss : 0.6554 +2024-05-23 17:05:31,612 - INFO - Test loss : 0.7359 +2024-05-23 17:05:31,612 - INFO - Saving output model to ../models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 17:05:31,624 - INFO - Saving output model to ../models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 17:05:31,657 - INFO - +No prior +2024-05-23 17:05:31,658 - INFO - (2262,) +2024-05-23 17:05:31,710 - INFO - Split ID: 0 +2024-05-23 17:05:31,710 - INFO - Top 1 acc (%): 70.07 +2024-05-23 17:05:31,710 - INFO - Top 3 acc (%): 86.6 +2024-05-23 17:05:31,710 - INFO - Top 5 acc (%): 90.05 +2024-05-23 17:05:31,710 - INFO - Top 10 acc (%): 92.88 +2024-05-23 17:05:33,130 - INFO - Split ID: 0 +2024-05-23 17:05:33,130 - INFO - Top 1 acc (%): 71.79 +2024-05-23 17:05:33,130 - INFO - Top 3 acc (%): 86.38 +2024-05-23 17:05:33,130 - INFO - Top 5 acc (%): 89.7 +2024-05-23 17:05:33,130 - INFO - Top 10 acc (%): 93.02 +2024-05-23 17:05:33,130 - INFO - +Space2Vec-theory +2024-05-23 17:05:33,130 - INFO - Model : model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 17:05:33,195 - INFO - (443,) +2024-05-23 17:05:33,211 - INFO - Split ID: 0 +2024-05-23 17:05:33,211 - INFO - Top 1 LocEnc acc (%): 7.67 +2024-05-23 17:05:33,211 - INFO - Top 3 LocEnc acc (%): 14.22 +2024-05-23 17:05:33,211 - INFO - Top 5 LocEnc acc (%): 19.41 +2024-05-23 17:05:33,212 - INFO - Top 10 LocEnc acc (%): 28.89 +2024-05-31 01:47:54,882 - INFO - +num_classes 500 +2024-05-31 01:47:54,883 - INFO - num train 19133 +2024-05-31 01:47:54,883 - INFO - num val 443 +2024-05-31 01:47:54,883 - INFO - train loss full_loss +2024-05-31 01:47:54,883 - INFO - model name 
../models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:47:54,883 - INFO - num users 2872 +2024-05-31 01:47:54,883 - INFO - meta data orig_meta +2024-05-31 01:47:55,691 - INFO - +Only Space2Vec-theory +2024-05-31 01:47:55,691 - INFO - Model : model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:47:55,800 - INFO - Saving output model to ../models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:47:55,843 - INFO - Saving output model to ../models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:47:55,872 - INFO - +No prior +2024-05-31 01:47:55,874 - INFO - (2262,) +2024-05-31 01:47:55,993 - INFO - Save results to ../eval_results/eval_birdsnap_orig_meta_test_no_prior.csv +2024-05-31 01:47:55,993 - INFO - Split ID: 0 +2024-05-31 01:47:55,993 - INFO - Top 1 acc (%): 70.07 +2024-05-31 01:47:55,993 - INFO - Top 3 acc (%): 86.6 +2024-05-31 01:47:55,993 - INFO - Top 5 acc (%): 90.05 +2024-05-31 01:47:55,993 - INFO - Top 10 acc (%): 92.88 +2024-05-31 01:47:57,082 - INFO - Split ID: 0 +2024-05-31 01:47:57,082 - INFO - Top 1 hit (%): 71.79 +2024-05-31 01:47:57,082 - INFO - Top 3 hit (%): 86.38 +2024-05-31 01:47:57,082 - INFO - Top 5 hit (%): 89.7 +2024-05-31 01:47:57,082 - INFO - Top 10 hit (%): 93.02 +2024-05-31 01:47:57,083 - INFO - +Only Space2Vec-theory +2024-05-31 01:47:57,083 - INFO - Model : model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:47:57,119 - INFO - (443,) +2024-05-31 01:47:57,135 - INFO - Split ID: 0 +2024-05-31 01:47:57,136 - INFO - Top 1 LocEnc acc (%): 7.67 +2024-05-31 01:47:57,136 - INFO - Top 3 LocEnc acc (%): 14.22 +2024-05-31 01:47:57,136 - INFO - Top 5 LocEnc acc (%): 19.41 +2024-05-31 01:47:57,136 - INFO - Top 10 LocEnc acc (%): 28.89 diff --git a/pre_trained_models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..d40132c9 Binary files /dev/null and b/pre_trained_models/space2vec_theory/model_birdsnap_orig_meta_Space2Vec-theory_inception_v3_0.0100_64_0.0100000_360.000_1_256_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.log b/pre_trained_models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.log new file mode 100755 index 00000000..8f4d1015 --- /dev/null +++ b/pre_trained_models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.log @@ -0,0 +1,755 @@ +2024-05-27 20:06:05,158 - INFO - +num_classes 62 +2024-05-27 20:06:05,158 - INFO - num train 363571 +2024-05-27 20:06:05,159 - INFO - num val 53041 +2024-05-27 20:06:05,159 - INFO - train loss full_loss +2024-05-27 20:06:05,159 - INFO - model name 
../models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-27 20:06:05,159 - INFO - num users 1 +2024-05-27 20:06:06,186 - INFO - +Epoch 0 +2024-05-27 20:06:11,439 - INFO - [0/363571] Loss : 2.2314 +2024-05-27 20:06:49,156 - INFO - Test loss : 1.4377 +2024-05-27 20:06:49,156 - INFO - +Epoch 1 +2024-05-27 20:06:54,138 - INFO - [0/363571] Loss : 1.9264 +2024-05-27 20:07:31,342 - INFO - Test loss : 0.0055 +2024-05-27 20:07:31,342 - INFO - +Epoch 2 +2024-05-27 20:07:35,648 - INFO - [0/363571] Loss : 7.4665 +2024-05-27 20:08:13,525 - INFO - Test loss : 4.2835 +2024-05-27 20:08:13,526 - INFO - +Epoch 3 +2024-05-27 20:08:17,404 - INFO - [0/363571] Loss : 3.5183 +2024-05-27 20:08:55,697 - INFO - Test loss : 0.8571 +2024-05-27 20:08:55,697 - INFO - +Epoch 4 +2024-05-27 20:09:00,211 - INFO - [0/363571] Loss : 1.8578 +2024-05-27 20:09:35,997 - INFO - Test loss : 0.4170 +2024-05-27 20:09:35,997 - INFO - +Epoch 5 +2024-05-27 20:09:40,559 - INFO - [0/363571] Loss : 2.5701 +2024-05-27 20:10:18,812 - INFO - Test loss : 0.6623 +2024-05-27 20:10:18,812 - INFO - +Epoch 6 +2024-05-27 20:10:23,573 - INFO - [0/363571] Loss : 2.1876 +2024-05-27 20:11:01,049 - INFO - Test loss : 1.0405 +2024-05-27 20:11:01,049 - INFO - +Epoch 7 +2024-05-27 20:11:05,573 - INFO - [0/363571] Loss : 1.7983 +2024-05-27 20:11:43,912 - INFO - Test loss : 1.3891 +2024-05-27 20:11:43,913 - INFO - +Epoch 8 +2024-05-27 20:11:48,736 - INFO - [0/363571] Loss : 1.8244 +2024-05-27 20:12:27,176 - INFO - Test loss : 1.4946 +2024-05-27 20:12:27,177 - INFO - +Epoch 9 +2024-05-27 20:12:31,741 - INFO - [0/363571] Loss : 1.8530 +2024-05-27 20:13:09,058 - INFO - Test loss : 1.3219 +2024-05-27 20:13:09,059 - INFO - +Epoch 10 +2024-05-27 20:13:13,736 - INFO - [0/363571] Loss : 1.7448 +2024-05-27 20:13:52,337 - INFO - Test loss : 1.0610 +2024-05-27 20:13:53,320 - INFO - (53041,) +2024-05-27 20:13:53,498 - INFO - Split ID: 0 +2024-05-27 20:13:53,519 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 12.54 +2024-05-27 20:13:53,521 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 21.08 +2024-05-27 20:13:53,523 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 27.23 +2024-05-27 20:13:53,525 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 37.63 +2024-05-27 20:13:53,532 - INFO - +No prior +2024-05-27 20:13:53,537 - INFO - (53041,) +2024-05-27 20:13:53,702 - INFO - Split ID: 0 +2024-05-27 20:13:53,702 - INFO - Top 1 (Epoch 10)acc (%): 69.83 +2024-05-27 20:13:53,702 - INFO - Top 3 (Epoch 10)acc (%): 84.61 +2024-05-27 20:13:53,702 - INFO - Top 5 (Epoch 10)acc (%): 89.23 +2024-05-27 20:13:53,702 - INFO - Top 10 (Epoch 10)acc (%): 94.29 +2024-05-27 20:14:36,179 - INFO - Split ID: 0 +2024-05-27 20:14:36,180 - INFO - Top 1 (Epoch 10)acc (%): 70.03 +2024-05-27 20:14:36,180 - INFO - Top 3 (Epoch 10)acc (%): 84.76 +2024-05-27 20:14:36,180 - INFO - Top 5 (Epoch 10)acc (%): 89.43 +2024-05-27 20:14:36,180 - INFO - Top 10 (Epoch 10)acc (%): 94.38 +2024-05-27 20:14:36,181 - INFO - +Epoch 11 +2024-05-27 20:14:40,763 - INFO - [0/363571] Loss : 1.6631 +2024-05-27 20:15:19,292 - INFO - Test loss : 0.8803 +2024-05-27 20:15:19,292 - INFO - +Epoch 12 +2024-05-27 20:15:23,786 - INFO - [0/363571] Loss : 1.6547 +2024-05-27 20:16:01,519 - INFO - Test loss : 0.8364 +2024-05-27 20:16:01,519 - INFO - +Epoch 13 +2024-05-27 20:16:06,233 - INFO - [0/363571] Loss : 1.6808 +2024-05-27 20:16:44,212 - INFO - Test loss : 0.8908 +2024-05-27 20:16:44,212 - INFO - +Epoch 14 +2024-05-27 20:16:48,810 - INFO - [0/363571] Loss : 1.6710 +2024-05-27 
20:17:25,789 - INFO - Test loss : 0.9804 +2024-05-27 20:17:25,790 - INFO - +Epoch 15 +2024-05-27 20:17:30,489 - INFO - [0/363571] Loss : 1.6446 +2024-05-27 20:18:08,852 - INFO - Test loss : 1.0363 +2024-05-27 20:18:08,853 - INFO - +Epoch 16 +2024-05-27 20:18:13,421 - INFO - [0/363571] Loss : 1.6061 +2024-05-27 20:18:50,179 - INFO - Test loss : 1.0269 +2024-05-27 20:18:50,181 - INFO - +Epoch 17 +2024-05-27 20:18:54,801 - INFO - [0/363571] Loss : 1.5861 +2024-05-27 20:19:32,306 - INFO - Test loss : 0.9761 +2024-05-27 20:19:32,306 - INFO - +Epoch 18 +2024-05-27 20:19:36,929 - INFO - [0/363571] Loss : 1.5843 +2024-05-27 20:20:10,605 - INFO - Test loss : 0.9219 +2024-05-27 20:20:10,605 - INFO - +Epoch 19 +2024-05-27 20:20:14,974 - INFO - [0/363571] Loss : 1.5708 +2024-05-27 20:20:48,140 - INFO - Test loss : 0.9007 +2024-05-27 20:20:48,140 - INFO - +Epoch 20 +2024-05-27 20:20:50,599 - INFO - [0/363571] Loss : 1.5553 +2024-05-27 20:21:25,280 - INFO - Test loss : 0.9228 +2024-05-27 20:21:26,298 - INFO - (53041,) +2024-05-27 20:21:26,460 - INFO - Split ID: 0 +2024-05-27 20:21:26,480 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 11.72 +2024-05-27 20:21:26,482 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 22.13 +2024-05-27 20:21:26,484 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 29.6 +2024-05-27 20:21:26,485 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 42.44 +2024-05-27 20:21:26,493 - INFO - +No prior +2024-05-27 20:21:26,500 - INFO - (53041,) +2024-05-27 20:21:26,665 - INFO - Split ID: 0 +2024-05-27 20:21:26,665 - INFO - Top 1 (Epoch 20)acc (%): 69.83 +2024-05-27 20:21:26,666 - INFO - Top 3 (Epoch 20)acc (%): 84.61 +2024-05-27 20:21:26,666 - INFO - Top 5 (Epoch 20)acc (%): 89.23 +2024-05-27 20:21:26,666 - INFO - Top 10 (Epoch 20)acc (%): 94.29 +2024-05-27 20:22:09,111 - INFO - Split ID: 0 +2024-05-27 20:22:09,112 - INFO - Top 1 (Epoch 20)acc (%): 70.19 +2024-05-27 20:22:09,112 - INFO - Top 3 (Epoch 20)acc (%): 84.89 +2024-05-27 20:22:09,112 - INFO - Top 5 (Epoch 20)acc (%): 89.49 +2024-05-27 20:22:09,112 - INFO - Top 10 (Epoch 20)acc (%): 94.43 +2024-05-27 20:22:09,113 - INFO - +Epoch 21 +2024-05-27 20:22:12,745 - INFO - [0/363571] Loss : 1.5548 +2024-05-27 20:22:47,565 - INFO - Test loss : 0.9621 +2024-05-27 20:22:47,565 - INFO - +Epoch 22 +2024-05-27 20:22:51,961 - INFO - [0/363571] Loss : 1.5281 +2024-05-27 20:23:27,786 - INFO - Test loss : 0.9817 +2024-05-27 20:23:27,786 - INFO - +Epoch 23 +2024-05-27 20:23:31,871 - INFO - [0/363571] Loss : 1.5174 +2024-05-27 20:24:04,803 - INFO - Test loss : 0.9595 +2024-05-27 20:24:04,804 - INFO - +Epoch 24 +2024-05-27 20:24:09,388 - INFO - [0/363571] Loss : 1.5033 +2024-05-27 20:24:44,192 - INFO - Test loss : 0.9105 +2024-05-27 20:24:44,192 - INFO - +Epoch 25 +2024-05-27 20:24:48,687 - INFO - [0/363571] Loss : 1.4934 +2024-05-27 20:25:23,454 - INFO - Test loss : 0.8590 +2024-05-27 20:25:23,454 - INFO - +Epoch 26 +2024-05-27 20:25:27,736 - INFO - [0/363571] Loss : 1.4893 +2024-05-27 20:26:01,492 - INFO - Test loss : 0.8268 +2024-05-27 20:26:01,493 - INFO - +Epoch 27 +2024-05-27 20:26:05,664 - INFO - [0/363571] Loss : 1.4646 +2024-05-27 20:26:40,320 - INFO - Test loss : 0.8228 +2024-05-27 20:26:40,320 - INFO - +Epoch 28 +2024-05-27 20:26:44,686 - INFO - [0/363571] Loss : 1.4612 +2024-05-27 20:27:18,704 - INFO - Test loss : 0.8392 +2024-05-27 20:27:18,704 - INFO - +Epoch 29 +2024-05-27 20:27:23,055 - INFO - [0/363571] Loss : 1.4617 +2024-05-27 20:27:57,613 - INFO - Test loss : 0.8586 +2024-05-27 20:27:57,613 - INFO - +Epoch 30 +2024-05-27 20:28:02,012 - INFO - [0/363571] Loss : 
1.4497 +2024-05-27 20:28:37,305 - INFO - Test loss : 0.8608 +2024-05-27 20:28:38,376 - INFO - (53041,) +2024-05-27 20:28:38,539 - INFO - Split ID: 0 +2024-05-27 20:28:38,561 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 15.73 +2024-05-27 20:28:38,563 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 27.19 +2024-05-27 20:28:38,565 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 34.39 +2024-05-27 20:28:38,566 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 47.75 +2024-05-27 20:28:38,573 - INFO - +No prior +2024-05-27 20:28:38,578 - INFO - (53041,) +2024-05-27 20:28:38,740 - INFO - Split ID: 0 +2024-05-27 20:28:38,740 - INFO - Top 1 (Epoch 30)acc (%): 69.83 +2024-05-27 20:28:38,740 - INFO - Top 3 (Epoch 30)acc (%): 84.61 +2024-05-27 20:28:38,740 - INFO - Top 5 (Epoch 30)acc (%): 89.23 +2024-05-27 20:28:38,740 - INFO - Top 10 (Epoch 30)acc (%): 94.29 +2024-05-27 20:29:20,776 - INFO - Split ID: 0 +2024-05-27 20:29:20,776 - INFO - Top 1 (Epoch 30)acc (%): 70.29 +2024-05-27 20:29:20,776 - INFO - Top 3 (Epoch 30)acc (%): 84.98 +2024-05-27 20:29:20,777 - INFO - Top 5 (Epoch 30)acc (%): 89.58 +2024-05-27 20:29:20,777 - INFO - Top 10 (Epoch 30)acc (%): 94.52 +2024-05-27 20:29:20,777 - INFO - +Epoch 31 +2024-05-27 20:29:25,087 - INFO - [0/363571] Loss : 1.4370 +2024-05-27 20:29:59,482 - INFO - Test loss : 0.8477 +2024-05-27 20:29:59,483 - INFO - +Epoch 32 +2024-05-27 20:30:03,921 - INFO - [0/363571] Loss : 1.4480 +2024-05-27 20:30:37,311 - INFO - Test loss : 0.8212 +2024-05-27 20:30:37,311 - INFO - +Epoch 33 +2024-05-27 20:30:41,875 - INFO - [0/363571] Loss : 1.4147 +2024-05-27 20:31:16,430 - INFO - Test loss : 0.8026 +2024-05-27 20:31:16,430 - INFO - +Epoch 34 +2024-05-27 20:31:20,674 - INFO - [0/363571] Loss : 1.4281 +2024-05-27 20:31:33,953 - INFO - Test loss : 0.7965 +2024-05-27 20:31:33,953 - INFO - +Epoch 35 +2024-05-27 20:31:34,480 - INFO - [0/363571] Loss : 1.4137 +2024-05-27 20:31:38,096 - INFO - Test loss : 0.8074 +2024-05-27 20:31:38,096 - INFO - +Epoch 36 +2024-05-27 20:31:38,602 - INFO - [0/363571] Loss : 1.4113 +2024-05-27 20:31:55,134 - INFO - Test loss : 0.8230 +2024-05-27 20:31:55,135 - INFO - +Epoch 37 +2024-05-27 20:31:57,746 - INFO - [0/363571] Loss : 1.4041 +2024-05-27 20:32:14,081 - INFO - Test loss : 0.8340 +2024-05-27 20:32:14,081 - INFO - +Epoch 38 +2024-05-27 20:32:15,796 - INFO - [0/363571] Loss : 1.4164 +2024-05-27 20:32:33,562 - INFO - Test loss : 0.8268 +2024-05-27 20:32:33,562 - INFO - +Epoch 39 +2024-05-27 20:32:35,821 - INFO - [0/363571] Loss : 1.3905 +2024-05-27 20:32:57,692 - INFO - Test loss : 0.8114 +2024-05-27 20:32:57,693 - INFO - +Epoch 40 +2024-05-27 20:33:02,148 - INFO - [0/363571] Loss : 1.3922 +2024-05-27 20:33:38,743 - INFO - Test loss : 0.7931 +2024-05-27 20:33:39,525 - INFO - (53041,) +2024-05-27 20:33:39,688 - INFO - Split ID: 0 +2024-05-27 20:33:39,712 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 16.29 +2024-05-27 20:33:39,714 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 27.11 +2024-05-27 20:33:39,716 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 33.93 +2024-05-27 20:33:39,718 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 46.71 +2024-05-27 20:33:39,724 - INFO - +No prior +2024-05-27 20:33:39,729 - INFO - (53041,) +2024-05-27 20:33:39,913 - INFO - Split ID: 0 +2024-05-27 20:33:39,913 - INFO - Top 1 (Epoch 40)acc (%): 69.83 +2024-05-27 20:33:39,913 - INFO - Top 3 (Epoch 40)acc (%): 84.61 +2024-05-27 20:33:39,913 - INFO - Top 5 (Epoch 40)acc (%): 89.23 +2024-05-27 20:33:39,914 - INFO - Top 10 (Epoch 40)acc (%): 94.29 +2024-05-27 20:34:24,387 - INFO - Split ID: 0 +2024-05-27 20:34:24,387 - INFO - Top 
1 (Epoch 40)acc (%): 70.33 +2024-05-27 20:34:24,387 - INFO - Top 3 (Epoch 40)acc (%): 85.06 +2024-05-27 20:34:24,387 - INFO - Top 5 (Epoch 40)acc (%): 89.6 +2024-05-27 20:34:24,388 - INFO - Top 10 (Epoch 40)acc (%): 94.57 +2024-05-27 20:34:24,388 - INFO - +Epoch 41 +2024-05-27 20:34:28,957 - INFO - [0/363571] Loss : 1.3843 +2024-05-27 20:35:07,789 - INFO - Test loss : 0.7775 +2024-05-27 20:35:07,789 - INFO - +Epoch 42 +2024-05-27 20:35:12,412 - INFO - [0/363571] Loss : 1.3674 +2024-05-27 20:35:50,752 - INFO - Test loss : 0.7714 +2024-05-27 20:35:50,752 - INFO - +Epoch 43 +2024-05-27 20:35:55,346 - INFO - [0/363571] Loss : 1.3619 +2024-05-27 20:36:34,167 - INFO - Test loss : 0.7758 +2024-05-27 20:36:34,168 - INFO - +Epoch 44 +2024-05-27 20:36:38,786 - INFO - [0/363571] Loss : 1.3607 +2024-05-27 20:37:15,488 - INFO - Test loss : 0.7864 +2024-05-27 20:37:15,488 - INFO - +Epoch 45 +2024-05-27 20:37:20,152 - INFO - [0/363571] Loss : 1.3740 +2024-05-27 20:37:59,009 - INFO - Test loss : 0.7897 +2024-05-27 20:37:59,009 - INFO - +Epoch 46 +2024-05-27 20:38:03,656 - INFO - [0/363571] Loss : 1.3605 +2024-05-27 20:38:42,450 - INFO - Test loss : 0.7883 +2024-05-27 20:38:42,451 - INFO - +Epoch 47 +2024-05-27 20:38:46,944 - INFO - [0/363571] Loss : 1.3687 +2024-05-27 20:39:23,603 - INFO - Test loss : 0.7788 +2024-05-27 20:39:23,603 - INFO - +Epoch 48 +2024-05-27 20:39:28,237 - INFO - [0/363571] Loss : 1.3516 +2024-05-27 20:40:07,198 - INFO - Test loss : 0.7681 +2024-05-27 20:40:07,198 - INFO - +Epoch 49 +2024-05-27 20:40:11,069 - INFO - [0/363571] Loss : 1.3552 +2024-05-27 20:40:46,392 - INFO - Test loss : 0.7590 +2024-05-27 20:40:46,402 - INFO - +Epoch 50 +2024-05-27 20:40:50,547 - INFO - [0/363571] Loss : 1.3521 +2024-05-27 20:41:28,728 - INFO - Test loss : 0.7542 +2024-05-27 20:41:29,449 - INFO - (53041,) +2024-05-27 20:41:29,633 - INFO - Split ID: 0 +2024-05-27 20:41:29,661 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 16.3 +2024-05-27 20:41:29,664 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 28.59 +2024-05-27 20:41:29,667 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 35.88 +2024-05-27 20:41:29,669 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 49.63 +2024-05-27 20:41:29,677 - INFO - +No prior +2024-05-27 20:41:29,684 - INFO - (53041,) +2024-05-27 20:41:29,848 - INFO - Split ID: 0 +2024-05-27 20:41:29,848 - INFO - Top 1 (Epoch 50)acc (%): 69.83 +2024-05-27 20:41:29,848 - INFO - Top 3 (Epoch 50)acc (%): 84.61 +2024-05-27 20:41:29,848 - INFO - Top 5 (Epoch 50)acc (%): 89.23 +2024-05-27 20:41:29,848 - INFO - Top 10 (Epoch 50)acc (%): 94.29 +2024-05-27 20:42:12,464 - INFO - Split ID: 0 +2024-05-27 20:42:12,464 - INFO - Top 1 (Epoch 50)acc (%): 70.37 +2024-05-27 20:42:12,464 - INFO - Top 3 (Epoch 50)acc (%): 85.09 +2024-05-27 20:42:12,464 - INFO - Top 5 (Epoch 50)acc (%): 89.69 +2024-05-27 20:42:12,464 - INFO - Top 10 (Epoch 50)acc (%): 94.6 +2024-05-27 20:42:12,465 - INFO - +Epoch 51 +2024-05-27 20:42:16,725 - INFO - [0/363571] Loss : 1.3381 +2024-05-27 20:42:54,201 - INFO - Test loss : 0.7556 +2024-05-27 20:42:54,201 - INFO - +Epoch 52 +2024-05-27 20:42:58,696 - INFO - [0/363571] Loss : 1.3489 +2024-05-27 20:43:36,087 - INFO - Test loss : 0.7596 +2024-05-27 20:43:36,087 - INFO - +Epoch 53 +2024-05-27 20:43:40,666 - INFO - [0/363571] Loss : 1.3358 +2024-05-27 20:44:17,267 - INFO - Test loss : 0.7604 +2024-05-27 20:44:17,267 - INFO - +Epoch 54 +2024-05-27 20:44:21,956 - INFO - [0/363571] Loss : 1.3363 +2024-05-27 20:45:00,804 - INFO - Test loss : 0.7621 +2024-05-27 20:45:00,804 - INFO - +Epoch 55 +2024-05-27 20:45:05,501 - 
INFO - [0/363571] Loss : 1.3346 +2024-05-27 20:45:41,776 - INFO - Test loss : 0.7601 +2024-05-27 20:45:41,776 - INFO - +Epoch 56 +2024-05-27 20:45:46,479 - INFO - [0/363571] Loss : 1.3288 +2024-05-27 20:46:24,498 - INFO - Test loss : 0.7590 +2024-05-27 20:46:24,498 - INFO - +Epoch 57 +2024-05-27 20:46:29,082 - INFO - [0/363571] Loss : 1.3299 +2024-05-27 20:47:07,754 - INFO - Test loss : 0.7579 +2024-05-27 20:47:07,755 - INFO - +Epoch 58 +2024-05-27 20:47:12,351 - INFO - [0/363571] Loss : 1.3217 +2024-05-27 20:47:50,695 - INFO - Test loss : 0.7549 +2024-05-27 20:47:50,696 - INFO - +Epoch 59 +2024-05-27 20:47:55,318 - INFO - [0/363571] Loss : 1.3151 +2024-05-27 20:48:33,365 - INFO - Test loss : 0.7525 +2024-05-27 20:48:33,365 - INFO - +Epoch 60 +2024-05-27 20:48:38,036 - INFO - [0/363571] Loss : 1.3228 +2024-05-27 20:49:14,534 - INFO - Test loss : 0.7485 +2024-05-27 20:49:15,275 - INFO - (53041,) +2024-05-27 20:49:15,435 - INFO - Split ID: 0 +2024-05-27 20:49:15,455 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 17.04 +2024-05-27 20:49:15,457 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 29.33 +2024-05-27 20:49:15,459 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 36.97 +2024-05-27 20:49:15,461 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 50.53 +2024-05-27 20:49:15,468 - INFO - +No prior +2024-05-27 20:49:15,475 - INFO - (53041,) +2024-05-27 20:49:15,649 - INFO - Split ID: 0 +2024-05-27 20:49:15,649 - INFO - Top 1 (Epoch 60)acc (%): 69.83 +2024-05-27 20:49:15,650 - INFO - Top 3 (Epoch 60)acc (%): 84.61 +2024-05-27 20:49:15,650 - INFO - Top 5 (Epoch 60)acc (%): 89.23 +2024-05-27 20:49:15,650 - INFO - Top 10 (Epoch 60)acc (%): 94.29 +2024-05-27 20:49:58,872 - INFO - Split ID: 0 +2024-05-27 20:49:58,873 - INFO - Top 1 (Epoch 60)acc (%): 70.41 +2024-05-27 20:49:58,873 - INFO - Top 3 (Epoch 60)acc (%): 85.15 +2024-05-27 20:49:58,873 - INFO - Top 5 (Epoch 60)acc (%): 89.7 +2024-05-27 20:49:58,873 - INFO - Top 10 (Epoch 60)acc (%): 94.6 +2024-05-27 20:49:58,874 - INFO - +Epoch 61 +2024-05-27 20:50:03,456 - INFO - [0/363571] Loss : 1.3225 +2024-05-27 20:50:40,683 - INFO - Test loss : 0.7436 +2024-05-27 20:50:40,683 - INFO - +Epoch 62 +2024-05-27 20:50:45,325 - INFO - [0/363571] Loss : 1.3136 +2024-05-27 20:51:18,836 - INFO - Test loss : 0.7419 +2024-05-27 20:51:18,836 - INFO - +Epoch 63 +2024-05-27 20:51:23,481 - INFO - [0/363571] Loss : 1.3133 +2024-05-27 20:52:00,980 - INFO - Test loss : 0.7430 +2024-05-27 20:52:00,980 - INFO - +Epoch 64 +2024-05-27 20:52:05,514 - INFO - [0/363571] Loss : 1.3134 +2024-05-27 20:52:42,855 - INFO - Test loss : 0.7449 +2024-05-27 20:52:42,855 - INFO - +Epoch 65 +2024-05-27 20:52:47,626 - INFO - [0/363571] Loss : 1.3121 +2024-05-27 20:53:25,511 - INFO - Test loss : 0.7480 +2024-05-27 20:53:25,511 - INFO - +Epoch 66 +2024-05-27 20:53:30,093 - INFO - [0/363571] Loss : 1.3006 +2024-05-27 20:54:05,673 - INFO - Test loss : 0.7493 +2024-05-27 20:54:05,673 - INFO - +Epoch 67 +2024-05-27 20:54:10,263 - INFO - [0/363571] Loss : 1.3202 +2024-05-27 20:54:48,436 - INFO - Test loss : 0.7464 +2024-05-27 20:54:48,436 - INFO - +Epoch 68 +2024-05-27 20:54:53,183 - INFO - [0/363571] Loss : 1.2955 +2024-05-27 20:55:31,287 - INFO - Test loss : 0.7423 +2024-05-27 20:55:31,289 - INFO - +Epoch 69 +2024-05-27 20:55:34,796 - INFO - [0/363571] Loss : 1.2953 +2024-05-27 20:56:11,627 - INFO - Test loss : 0.7371 +2024-05-27 20:56:11,627 - INFO - +Epoch 70 +2024-05-27 20:56:16,336 - INFO - [0/363571] Loss : 1.2963 +2024-05-27 20:56:53,712 - INFO - Test loss : 0.7332 +2024-05-27 20:56:54,701 - INFO - (53041,) 
+2024-05-27 20:56:54,862 - INFO - Split ID: 0 +2024-05-27 20:56:54,885 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 16.69 +2024-05-27 20:56:54,887 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 29.52 +2024-05-27 20:56:54,889 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 36.96 +2024-05-27 20:56:54,890 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 50.46 +2024-05-27 20:56:54,897 - INFO - +No prior +2024-05-27 20:56:54,902 - INFO - (53041,) +2024-05-27 20:56:55,068 - INFO - Split ID: 0 +2024-05-27 20:56:55,068 - INFO - Top 1 (Epoch 70)acc (%): 69.83 +2024-05-27 20:56:55,068 - INFO - Top 3 (Epoch 70)acc (%): 84.61 +2024-05-27 20:56:55,068 - INFO - Top 5 (Epoch 70)acc (%): 89.23 +2024-05-27 20:56:55,069 - INFO - Top 10 (Epoch 70)acc (%): 94.29 +2024-05-27 20:57:38,218 - INFO - Split ID: 0 +2024-05-27 20:57:38,219 - INFO - Top 1 (Epoch 70)acc (%): 70.43 +2024-05-27 20:57:38,219 - INFO - Top 3 (Epoch 70)acc (%): 85.15 +2024-05-27 20:57:38,219 - INFO - Top 5 (Epoch 70)acc (%): 89.72 +2024-05-27 20:57:38,219 - INFO - Top 10 (Epoch 70)acc (%): 94.63 +2024-05-27 20:57:38,220 - INFO - +Epoch 71 +2024-05-27 20:57:42,678 - INFO - [0/363571] Loss : 1.3081 +2024-05-27 20:58:18,604 - INFO - Test loss : 0.7290 +2024-05-27 20:58:18,604 - INFO - +Epoch 72 +2024-05-27 20:58:23,330 - INFO - [0/363571] Loss : 1.3074 +2024-05-27 20:59:00,148 - INFO - Test loss : 0.7250 +2024-05-27 20:59:00,148 - INFO - +Epoch 73 +2024-05-27 20:59:04,484 - INFO - [0/363571] Loss : 1.3077 +2024-05-27 20:59:42,150 - INFO - Test loss : 0.7199 +2024-05-27 20:59:42,151 - INFO - +Epoch 74 +2024-05-27 20:59:46,689 - INFO - [0/363571] Loss : 1.3012 +2024-05-27 21:00:24,933 - INFO - Test loss : 0.7159 +2024-05-27 21:00:24,934 - INFO - +Epoch 75 +2024-05-27 21:00:29,499 - INFO - [0/363571] Loss : 1.3013 +2024-05-27 21:01:04,841 - INFO - Test loss : 0.7155 +2024-05-27 21:01:04,841 - INFO - +Epoch 76 +2024-05-27 21:01:09,365 - INFO - [0/363571] Loss : 1.2896 +2024-05-27 21:01:47,397 - INFO - Test loss : 0.7193 +2024-05-27 21:01:47,397 - INFO - +Epoch 77 +2024-05-27 21:01:51,651 - INFO - [0/363571] Loss : 1.2886 +2024-05-27 21:02:23,216 - INFO - Test loss : 0.7264 +2024-05-27 21:02:23,216 - INFO - +Epoch 78 +2024-05-27 21:02:26,252 - INFO - [0/363571] Loss : 1.2888 +2024-05-27 21:03:00,182 - INFO - Test loss : 0.7313 +2024-05-27 21:03:00,187 - INFO - +Epoch 79 +2024-05-27 21:03:04,663 - INFO - [0/363571] Loss : 1.2879 +2024-05-27 21:03:40,281 - INFO - Test loss : 0.7338 +2024-05-27 21:03:40,281 - INFO - +Epoch 80 +2024-05-27 21:03:44,618 - INFO - [0/363571] Loss : 1.2797 +2024-05-27 21:04:20,270 - INFO - Test loss : 0.7334 +2024-05-27 21:04:21,062 - INFO - (53041,) +2024-05-27 21:04:21,235 - INFO - Split ID: 0 +2024-05-27 21:04:21,256 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 17.04 +2024-05-27 21:04:21,259 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 29.68 +2024-05-27 21:04:21,262 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 36.83 +2024-05-27 21:04:21,265 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 51.0 +2024-05-27 21:04:21,274 - INFO - +No prior +2024-05-27 21:04:21,291 - INFO - (53041,) +2024-05-27 21:04:21,478 - INFO - Split ID: 0 +2024-05-27 21:04:21,478 - INFO - Top 1 (Epoch 80)acc (%): 69.83 +2024-05-27 21:04:21,478 - INFO - Top 3 (Epoch 80)acc (%): 84.61 +2024-05-27 21:04:21,479 - INFO - Top 5 (Epoch 80)acc (%): 89.23 +2024-05-27 21:04:21,479 - INFO - Top 10 (Epoch 80)acc (%): 94.29 +2024-05-27 21:05:03,658 - INFO - Split ID: 0 +2024-05-27 21:05:03,658 - INFO - Top 1 (Epoch 80)acc (%): 70.44 +2024-05-27 21:05:03,658 - INFO - Top 3 (Epoch 80)acc (%): 85.21 +2024-05-27 
21:05:03,659 - INFO - Top 5 (Epoch 80)acc (%): 89.72 +2024-05-27 21:05:03,659 - INFO - Top 10 (Epoch 80)acc (%): 94.65 +2024-05-27 21:05:03,659 - INFO - +Epoch 81 +2024-05-27 21:05:07,311 - INFO - [0/363571] Loss : 1.2806 +2024-05-27 21:05:42,991 - INFO - Test loss : 0.7309 +2024-05-27 21:05:42,991 - INFO - +Epoch 82 +2024-05-27 21:05:47,226 - INFO - [0/363571] Loss : 1.2818 +2024-05-27 21:06:22,128 - INFO - Test loss : 0.7272 +2024-05-27 21:06:22,128 - INFO - +Epoch 83 +2024-05-27 21:06:26,511 - INFO - [0/363571] Loss : 1.2841 +2024-05-27 21:07:01,169 - INFO - Test loss : 0.7221 +2024-05-27 21:07:01,169 - INFO - +Epoch 84 +2024-05-27 21:07:05,523 - INFO - [0/363571] Loss : 1.2971 +2024-05-27 21:07:38,653 - INFO - Test loss : 0.7171 +2024-05-27 21:07:38,653 - INFO - +Epoch 85 +2024-05-27 21:07:40,994 - INFO - [0/363571] Loss : 1.2835 +2024-05-27 21:08:15,764 - INFO - Test loss : 0.7148 +2024-05-27 21:08:15,764 - INFO - +Epoch 86 +2024-05-27 21:08:19,751 - INFO - [0/363571] Loss : 1.2756 +2024-05-27 21:08:54,328 - INFO - Test loss : 0.7160 +2024-05-27 21:08:54,328 - INFO - +Epoch 87 +2024-05-27 21:08:57,768 - INFO - [0/363571] Loss : 1.2697 +2024-05-27 21:09:19,458 - INFO - Test loss : 0.7180 +2024-05-27 21:09:19,458 - INFO - +Epoch 88 +2024-05-27 21:09:22,997 - INFO - [0/363571] Loss : 1.2759 +2024-05-27 21:09:52,146 - INFO - Test loss : 0.7202 +2024-05-27 21:09:52,147 - INFO - +Epoch 89 +2024-05-27 21:09:55,717 - INFO - [0/363571] Loss : 1.2875 +2024-05-27 21:10:26,939 - INFO - Test loss : 0.7217 +2024-05-27 21:10:26,939 - INFO - +Epoch 90 +2024-05-27 21:10:31,552 - INFO - [0/363571] Loss : 1.2789 +2024-05-27 21:11:05,786 - INFO - Test loss : 0.7217 +2024-05-27 21:11:06,525 - INFO - (53041,) +2024-05-27 21:11:06,704 - INFO - Split ID: 0 +2024-05-27 21:11:06,731 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 17.12 +2024-05-27 21:11:06,733 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 30.36 +2024-05-27 21:11:06,736 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 37.51 +2024-05-27 21:11:06,739 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 51.53 +2024-05-27 21:11:06,746 - INFO - +No prior +2024-05-27 21:11:06,753 - INFO - (53041,) +2024-05-27 21:11:06,920 - INFO - Split ID: 0 +2024-05-27 21:11:06,921 - INFO - Top 1 (Epoch 90)acc (%): 69.83 +2024-05-27 21:11:06,921 - INFO - Top 3 (Epoch 90)acc (%): 84.61 +2024-05-27 21:11:06,921 - INFO - Top 5 (Epoch 90)acc (%): 89.23 +2024-05-27 21:11:06,921 - INFO - Top 10 (Epoch 90)acc (%): 94.29 +2024-05-27 21:11:49,204 - INFO - Split ID: 0 +2024-05-27 21:11:49,205 - INFO - Top 1 (Epoch 90)acc (%): 70.48 +2024-05-27 21:11:49,205 - INFO - Top 3 (Epoch 90)acc (%): 85.21 +2024-05-27 21:11:49,205 - INFO - Top 5 (Epoch 90)acc (%): 89.75 +2024-05-27 21:11:49,205 - INFO - Top 10 (Epoch 90)acc (%): 94.66 +2024-05-27 21:11:49,206 - INFO - +Epoch 91 +2024-05-27 21:11:53,039 - INFO - [0/363571] Loss : 1.2675 +2024-05-27 21:12:28,797 - INFO - Test loss : 0.7212 +2024-05-27 21:12:28,798 - INFO - +Epoch 92 +2024-05-27 21:12:32,736 - INFO - [0/363571] Loss : 1.2723 +2024-05-27 21:13:07,780 - INFO - Test loss : 0.7200 +2024-05-27 21:13:07,781 - INFO - +Epoch 93 +2024-05-27 21:13:11,337 - INFO - [0/363571] Loss : 1.2765 +2024-05-27 21:13:42,772 - INFO - Test loss : 0.7176 +2024-05-27 21:13:42,773 - INFO - +Epoch 94 +2024-05-27 21:13:47,118 - INFO - [0/363571] Loss : 1.2771 +2024-05-27 21:14:20,786 - INFO - Test loss : 0.7140 +2024-05-27 21:14:20,786 - INFO - +Epoch 95 +2024-05-27 21:14:25,536 - INFO - [0/363571] Loss : 1.2695 +2024-05-27 21:14:54,028 - INFO - Test loss : 0.7109 +2024-05-27 
21:14:54,028 - INFO - +Epoch 96 +2024-05-27 21:14:56,764 - INFO - [0/363571] Loss : 1.2695 +2024-05-27 21:15:17,545 - INFO - Test loss : 0.7084 +2024-05-27 21:15:17,546 - INFO - +Epoch 97 +2024-05-27 21:15:20,207 - INFO - [0/363571] Loss : 1.2668 +2024-05-27 21:15:35,160 - INFO - Test loss : 0.7076 +2024-05-27 21:15:35,160 - INFO - +Epoch 98 +2024-05-27 21:15:37,616 - INFO - [0/363571] Loss : 1.2733 +2024-05-27 21:15:56,172 - INFO - Test loss : 0.7068 +2024-05-27 21:15:56,172 - INFO - +Epoch 99 +2024-05-27 21:15:58,747 - INFO - [0/363571] Loss : 1.2636 +2024-05-27 21:16:14,274 - INFO - Test loss : 0.7065 +2024-05-27 21:16:14,274 - INFO - +Epoch 100 +2024-05-27 21:16:18,775 - INFO - [0/363571] Loss : 1.2647 +2024-05-27 21:16:56,515 - INFO - Test loss : 0.7082 +2024-05-27 21:16:57,224 - INFO - (53041,) +2024-05-27 21:16:57,404 - INFO - Split ID: 0 +2024-05-27 21:16:57,423 - INFO - Top 1 LocEnc (Epoch 100)acc (%): 16.95 +2024-05-27 21:16:57,425 - INFO - Top 3 LocEnc (Epoch 100)acc (%): 30.57 +2024-05-27 21:16:57,427 - INFO - Top 5 LocEnc (Epoch 100)acc (%): 37.54 +2024-05-27 21:16:57,429 - INFO - Top 10 LocEnc (Epoch 100)acc (%): 51.65 +2024-05-27 21:16:57,436 - INFO - +No prior +2024-05-27 21:16:57,443 - INFO - (53041,) +2024-05-27 21:16:57,610 - INFO - Split ID: 0 +2024-05-27 21:16:57,611 - INFO - Top 1 (Epoch 100)acc (%): 69.83 +2024-05-27 21:16:57,611 - INFO - Top 3 (Epoch 100)acc (%): 84.61 +2024-05-27 21:16:57,611 - INFO - Top 5 (Epoch 100)acc (%): 89.23 +2024-05-27 21:16:57,611 - INFO - Top 10 (Epoch 100)acc (%): 94.29 +2024-05-27 21:17:41,686 - INFO - Split ID: 0 +2024-05-27 21:17:41,687 - INFO - Top 1 (Epoch 100)acc (%): 70.49 +2024-05-27 21:17:41,687 - INFO - Top 3 (Epoch 100)acc (%): 85.2 +2024-05-27 21:17:41,687 - INFO - Top 5 (Epoch 100)acc (%): 89.75 +2024-05-27 21:17:41,687 - INFO - Top 10 (Epoch 100)acc (%): 94.66 +2024-05-27 21:17:41,688 - INFO - +Epoch 101 +2024-05-27 21:17:46,342 - INFO - [0/363571] Loss : 1.2664 +2024-05-27 21:18:24,145 - INFO - Test loss : 0.7103 +2024-05-27 21:18:24,146 - INFO - +Epoch 102 +2024-05-27 21:18:28,833 - INFO - [0/363571] Loss : 1.2578 +2024-05-27 21:19:07,637 - INFO - Test loss : 0.7127 +2024-05-27 21:19:07,637 - INFO - +Epoch 103 +2024-05-27 21:19:12,204 - INFO - [0/363571] Loss : 1.2671 +2024-05-27 21:19:50,553 - INFO - Test loss : 0.7136 +2024-05-27 21:19:50,553 - INFO - +Epoch 104 +2024-05-27 21:19:55,220 - INFO - [0/363571] Loss : 1.2524 +2024-05-27 21:20:34,207 - INFO - Test loss : 0.7127 +2024-05-27 21:20:34,207 - INFO - +Epoch 105 +2024-05-27 21:20:38,962 - INFO - [0/363571] Loss : 1.2732 +2024-05-27 21:21:17,560 - INFO - Test loss : 0.7092 +2024-05-27 21:21:17,560 - INFO - +Epoch 106 +2024-05-27 21:21:22,282 - INFO - [0/363571] Loss : 1.2566 +2024-05-27 21:22:01,309 - INFO - Test loss : 0.7069 +2024-05-27 21:22:01,309 - INFO - +Epoch 107 +2024-05-27 21:22:05,975 - INFO - [0/363571] Loss : 1.2590 +2024-05-27 21:22:43,540 - INFO - Test loss : 0.7051 +2024-05-27 21:22:43,540 - INFO - +Epoch 108 +2024-05-27 21:22:48,187 - INFO - [0/363571] Loss : 1.2600 +2024-05-27 21:23:22,208 - INFO - Test loss : 0.7049 +2024-05-27 21:23:22,208 - INFO - +Epoch 109 +2024-05-27 21:23:26,805 - INFO - [0/363571] Loss : 1.2602 +2024-05-27 21:24:05,410 - INFO - Test loss : 0.7055 +2024-05-27 21:24:05,410 - INFO - +Epoch 110 +2024-05-27 21:24:10,022 - INFO - [0/363571] Loss : 1.2634 +2024-05-27 21:24:48,422 - INFO - Test loss : 0.7049 +2024-05-27 21:24:49,324 - INFO - (53041,) +2024-05-27 21:24:49,502 - INFO - Split ID: 0 +2024-05-27 21:24:49,524 - INFO - 
Top 1 LocEnc (Epoch 110)acc (%): 17.12 +2024-05-27 21:24:49,526 - INFO - Top 3 LocEnc (Epoch 110)acc (%): 30.49 +2024-05-27 21:24:49,528 - INFO - Top 5 LocEnc (Epoch 110)acc (%): 37.68 +2024-05-27 21:24:49,530 - INFO - Top 10 LocEnc (Epoch 110)acc (%): 52.08 +2024-05-27 21:24:49,538 - INFO - +No prior +2024-05-27 21:24:49,545 - INFO - (53041,) +2024-05-27 21:24:49,711 - INFO - Split ID: 0 +2024-05-27 21:24:49,711 - INFO - Top 1 (Epoch 110)acc (%): 69.83 +2024-05-27 21:24:49,711 - INFO - Top 3 (Epoch 110)acc (%): 84.61 +2024-05-27 21:24:49,711 - INFO - Top 5 (Epoch 110)acc (%): 89.23 +2024-05-27 21:24:49,711 - INFO - Top 10 (Epoch 110)acc (%): 94.29 +2024-05-27 21:25:32,054 - INFO - Split ID: 0 +2024-05-27 21:25:32,055 - INFO - Top 1 (Epoch 110)acc (%): 70.48 +2024-05-27 21:25:32,055 - INFO - Top 3 (Epoch 110)acc (%): 85.2 +2024-05-27 21:25:32,055 - INFO - Top 5 (Epoch 110)acc (%): 89.74 +2024-05-27 21:25:32,055 - INFO - Top 10 (Epoch 110)acc (%): 94.66 +2024-05-27 21:25:32,056 - INFO - +Epoch 111 +2024-05-27 21:25:36,841 - INFO - [0/363571] Loss : 1.2592 +2024-05-27 21:26:15,776 - INFO - Test loss : 0.7041 +2024-05-27 21:26:15,776 - INFO - +Epoch 112 +2024-05-27 21:26:20,348 - INFO - [0/363571] Loss : 1.2448 +2024-05-27 21:26:59,238 - INFO - Test loss : 0.7041 +2024-05-27 21:26:59,238 - INFO - +Epoch 113 +2024-05-27 21:27:03,926 - INFO - [0/363571] Loss : 1.2457 +2024-05-27 21:27:42,245 - INFO - Test loss : 0.7049 +2024-05-27 21:27:42,245 - INFO - +Epoch 114 +2024-05-27 21:27:46,882 - INFO - [0/363571] Loss : 1.2649 +2024-05-27 21:28:23,688 - INFO - Test loss : 0.7046 +2024-05-27 21:28:23,688 - INFO - +Epoch 115 +2024-05-27 21:28:28,352 - INFO - [0/363571] Loss : 1.2477 +2024-05-27 21:29:07,186 - INFO - Test loss : 0.7046 +2024-05-27 21:29:07,186 - INFO - +Epoch 116 +2024-05-27 21:29:11,869 - INFO - [0/363571] Loss : 1.2548 +2024-05-27 21:29:50,104 - INFO - Test loss : 0.7048 +2024-05-27 21:29:50,104 - INFO - +Epoch 117 +2024-05-27 21:29:54,298 - INFO - [0/363571] Loss : 1.2566 +2024-05-27 21:30:33,229 - INFO - Test loss : 0.7054 +2024-05-27 21:30:33,229 - INFO - +Epoch 118 +2024-05-27 21:30:38,031 - INFO - [0/363571] Loss : 1.2570 +2024-05-27 21:31:15,902 - INFO - Test loss : 0.7059 +2024-05-27 21:31:15,902 - INFO - +Epoch 119 +2024-05-27 21:31:20,502 - INFO - [0/363571] Loss : 1.2498 +2024-05-27 21:31:58,864 - INFO - Test loss : 0.7061 +2024-05-27 21:31:58,864 - INFO - Saving output model to ../models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-27 21:31:58,878 - INFO - Saving output model to ../models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-27 21:31:58,898 - INFO - +No prior +2024-05-27 21:31:58,903 - INFO - (53041,) +2024-05-27 21:31:59,070 - INFO - Split ID: 0 +2024-05-27 21:31:59,070 - INFO - Top 1 acc (%): 69.83 +2024-05-27 21:31:59,074 - INFO - Top 3 acc (%): 84.61 +2024-05-27 21:31:59,074 - INFO - Top 5 acc (%): 89.23 +2024-05-27 21:31:59,074 - INFO - Top 10 acc (%): 94.29 +2024-05-27 21:32:41,398 - INFO - Split ID: 0 +2024-05-27 21:32:41,399 - INFO - Top 1 acc (%): 70.49 +2024-05-27 21:32:41,399 - INFO - Top 3 acc (%): 85.2 +2024-05-27 21:32:41,399 - INFO - Top 5 acc (%): 89.76 +2024-05-27 21:32:41,399 - INFO - Top 10 acc (%): 94.66 +2024-05-27 21:32:41,400 - INFO - +Space2Vec-theory +2024-05-27 21:32:41,400 - INFO - Model : 
model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-27 21:32:42,384 - INFO - (53041,) +2024-05-27 21:32:42,548 - INFO - Split ID: 0 +2024-05-27 21:32:42,572 - INFO - Top 1 LocEnc acc (%): 17.18 +2024-05-27 21:32:42,574 - INFO - Top 3 LocEnc acc (%): 30.3 +2024-05-27 21:32:42,576 - INFO - Top 5 LocEnc acc (%): 38.01 +2024-05-27 21:32:42,578 - INFO - Top 10 LocEnc acc (%): 52.16 +2024-05-31 01:56:28,297 - INFO - +num_classes 62 +2024-05-31 01:56:28,315 - INFO - num train 363571 +2024-05-31 01:56:28,315 - INFO - num val 53041 +2024-05-31 01:56:28,315 - INFO - train loss full_loss +2024-05-31 01:56:28,315 - INFO - model name ../models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:56:28,315 - INFO - num users 1 +2024-05-31 01:56:29,213 - INFO - +Only Space2Vec-theory +2024-05-31 01:56:29,213 - INFO - Model : model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:56:29,268 - INFO - Saving output model to ../models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:56:29,298 - INFO - Saving output model to ../models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:56:29,317 - INFO - +No prior +2024-05-31 01:56:29,327 - INFO - (53041,) +2024-05-31 01:56:30,251 - INFO - Save results to ../eval_results/eval_fmow__val_no_prior.csv +2024-05-31 01:56:30,251 - INFO - Split ID: 0 +2024-05-31 01:56:30,251 - INFO - Top 1 acc (%): 69.83 +2024-05-31 01:56:30,251 - INFO - Top 3 acc (%): 84.61 +2024-05-31 01:56:30,252 - INFO - Top 5 acc (%): 89.23 +2024-05-31 01:56:30,252 - INFO - Top 10 acc (%): 94.29 +2024-05-31 01:56:55,361 - INFO - Split ID: 0 +2024-05-31 01:56:55,361 - INFO - Top 1 hit (%): 70.49 +2024-05-31 01:56:55,362 - INFO - Top 3 hit (%): 85.2 +2024-05-31 01:56:55,362 - INFO - Top 5 hit (%): 89.76 +2024-05-31 01:56:55,362 - INFO - Top 10 hit (%): 94.66 +2024-05-31 01:56:55,375 - INFO - +Only Space2Vec-theory +2024-05-31 01:56:55,375 - INFO - Model : model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 01:56:55,908 - INFO - (53041,) +2024-05-31 01:56:56,069 - INFO - Split ID: 0 +2024-05-31 01:56:56,086 - INFO - Top 1 LocEnc acc (%): 17.18 +2024-05-31 01:56:56,088 - INFO - Top 3 LocEnc acc (%): 30.3 +2024-05-31 01:56:56,089 - INFO - Top 5 LocEnc acc (%): 38.01 +2024-05-31 01:56:56,091 - INFO - Top 10 LocEnc acc (%): 52.16 diff --git a/pre_trained_models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar b/pre_trained_models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar new file mode 100755 index 00000000..f95969df Binary files /dev/null and b/pre_trained_models/space2vec_theory/model_fmow_Space2Vec-theory_inception_v3_0.0200_64_0.0001000_360.000_1_512_BATCH8192_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..baed5d64 --- 
/dev/null +++ b/pre_trained_models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.log @@ -0,0 +1,737 @@ +2024-05-26 10:41:08,820 - INFO - +num_classes 5089 +2024-05-26 10:41:08,820 - INFO - num train 569465 +2024-05-26 10:41:08,820 - INFO - num val 93622 +2024-05-26 10:41:08,820 - INFO - train loss full_loss +2024-05-26 10:41:08,820 - INFO - model name ../models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-26 10:41:08,820 - INFO - num users 17302 +2024-05-26 10:41:10,004 - INFO - +Epoch 0 +2024-05-26 10:42:28,665 - INFO - [204800/569465] Loss : 1.3245 +2024-05-26 10:42:50,819 - INFO - [266240/569465] Loss : 1.2199 +2024-05-26 10:43:24,732 - INFO - Test loss : 0.3533 +2024-05-26 10:43:24,732 - INFO - +Epoch 1 +2024-05-26 10:44:42,775 - INFO - [204800/569465] Loss : 0.7504 +2024-05-26 10:44:57,783 - INFO - [266240/569465] Loss : 0.7387 +2024-05-26 10:45:31,118 - INFO - Test loss : 0.3341 +2024-05-26 10:45:31,118 - INFO - +Epoch 2 +2024-05-26 10:46:53,326 - INFO - [204800/569465] Loss : 0.6585 +2024-05-26 10:47:15,792 - INFO - [266240/569465] Loss : 0.6551 +2024-05-26 10:47:53,086 - INFO - Test loss : 0.3614 +2024-05-26 10:47:53,086 - INFO - +Epoch 3 +2024-05-26 10:49:09,242 - INFO - [204800/569465] Loss : 0.6119 +2024-05-26 10:49:31,097 - INFO - [266240/569465] Loss : 0.6099 +2024-05-26 10:50:05,721 - INFO - Test loss : 0.3907 +2024-05-26 10:50:05,721 - INFO - +Epoch 4 +2024-05-26 10:51:08,069 - INFO - [204800/569465] Loss : 0.5803 +2024-05-26 10:51:19,608 - INFO - [266240/569465] Loss : 0.5796 +2024-05-26 10:51:52,449 - INFO - Test loss : 0.4171 +2024-05-26 10:51:52,449 - INFO - +Epoch 5 +2024-05-26 10:53:02,547 - INFO - [204800/569465] Loss : 0.5590 +2024-05-26 10:53:25,796 - INFO - [266240/569465] Loss : 0.5592 +2024-05-26 10:53:56,219 - INFO - Test loss : 0.4333 +2024-05-26 10:53:56,219 - INFO - +Epoch 6 +2024-05-26 10:55:11,558 - INFO - [204800/569465] Loss : 0.5412 +2024-05-26 10:55:30,430 - INFO - [266240/569465] Loss : 0.5422 +2024-05-26 10:56:03,868 - INFO - Test loss : 0.4722 +2024-05-26 10:56:03,868 - INFO - +Epoch 7 +2024-05-26 10:57:15,530 - INFO - [204800/569465] Loss : 0.5276 +2024-05-26 10:57:37,517 - INFO - [266240/569465] Loss : 0.5288 +2024-05-26 10:58:05,768 - INFO - Test loss : 0.5176 +2024-05-26 10:58:05,768 - INFO - +Epoch 8 +2024-05-26 10:59:14,106 - INFO - [204800/569465] Loss : 0.5163 +2024-05-26 10:59:37,886 - INFO - [266240/569465] Loss : 0.5172 +2024-05-26 11:00:05,773 - INFO - Test loss : 0.5314 +2024-05-26 11:00:05,773 - INFO - +Epoch 9 +2024-05-26 11:01:18,706 - INFO - [204800/569465] Loss : 0.5077 +2024-05-26 11:01:37,150 - INFO - [266240/569465] Loss : 0.5086 +2024-05-26 11:02:05,876 - INFO - Test loss : 0.5378 +2024-05-26 11:02:05,876 - INFO - +Epoch 10 +2024-05-26 11:03:24,483 - INFO - [204800/569465] Loss : 0.4991 +2024-05-26 11:03:45,139 - INFO - [266240/569465] Loss : 0.5002 +2024-05-26 11:04:18,216 - INFO - Test loss : 0.5503 +2024-05-26 11:04:21,880 - INFO - (93622,) +2024-05-26 11:05:17,339 - INFO - Split ID: 0 +2024-05-26 11:05:17,385 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 0.99 +2024-05-26 11:05:17,389 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 2.57 +2024-05-26 11:05:17,392 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 3.94 +2024-05-26 11:05:17,396 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 7.08 +2024-05-26 11:05:17,676 - INFO - +No prior +2024-05-26 11:05:18,303 - INFO - (95986,) +2024-05-26 
11:05:41,704 - INFO - Split ID: 0 +2024-05-26 11:05:41,705 - INFO - Top 1 (Epoch 10)acc (%): 63.27 +2024-05-26 11:05:41,705 - INFO - Top 3 (Epoch 10)acc (%): 79.82 +2024-05-26 11:05:41,705 - INFO - Top 5 (Epoch 10)acc (%): 84.51 +2024-05-26 11:05:41,706 - INFO - Top 10 (Epoch 10)acc (%): 88.99 +2024-05-26 11:06:58,730 - INFO - Split ID: 0 +2024-05-26 11:06:58,730 - INFO - Top 1 (Epoch 10)acc (%): 69.41 +2024-05-26 11:06:58,731 - INFO - Top 3 (Epoch 10)acc (%): 84.57 +2024-05-26 11:06:58,731 - INFO - Top 5 (Epoch 10)acc (%): 88.28 +2024-05-26 11:06:58,731 - INFO - Top 10 (Epoch 10)acc (%): 91.86 +2024-05-26 11:06:58,734 - INFO - +Epoch 11 +2024-05-26 11:08:15,336 - INFO - [204800/569465] Loss : 0.4910 +2024-05-26 11:08:37,600 - INFO - [266240/569465] Loss : 0.4919 +2024-05-26 11:09:14,437 - INFO - Test loss : 0.5816 +2024-05-26 11:09:14,437 - INFO - +Epoch 12 +2024-05-26 11:10:33,396 - INFO - [204800/569465] Loss : 0.4848 +2024-05-26 11:10:53,132 - INFO - [266240/569465] Loss : 0.4862 +2024-05-26 11:11:27,517 - INFO - Test loss : 0.5938 +2024-05-26 11:11:27,517 - INFO - +Epoch 13 +2024-05-26 11:12:44,489 - INFO - [204800/569465] Loss : 0.4783 +2024-05-26 11:13:04,291 - INFO - [266240/569465] Loss : 0.4798 +2024-05-26 11:13:31,122 - INFO - Test loss : 0.6317 +2024-05-26 11:13:31,122 - INFO - +Epoch 14 +2024-05-26 11:14:28,062 - INFO - [204800/569465] Loss : 0.4745 +2024-05-26 11:14:51,791 - INFO - [266240/569465] Loss : 0.4755 +2024-05-26 11:15:24,223 - INFO - Test loss : 0.6254 +2024-05-26 11:15:24,223 - INFO - +Epoch 15 +2024-05-26 11:16:37,165 - INFO - [204800/569465] Loss : 0.4686 +2024-05-26 11:16:56,260 - INFO - [266240/569465] Loss : 0.4701 +2024-05-26 11:17:22,728 - INFO - Test loss : 0.6403 +2024-05-26 11:17:22,729 - INFO - +Epoch 16 +2024-05-26 11:18:39,881 - INFO - [204800/569465] Loss : 0.4652 +2024-05-26 11:19:03,283 - INFO - [266240/569465] Loss : 0.4671 +2024-05-26 11:19:35,072 - INFO - Test loss : 0.6421 +2024-05-26 11:19:35,072 - INFO - +Epoch 17 +2024-05-26 11:20:53,141 - INFO - [204800/569465] Loss : 0.4611 +2024-05-26 11:21:15,339 - INFO - [266240/569465] Loss : 0.4624 +2024-05-26 11:21:50,603 - INFO - Test loss : 0.6684 +2024-05-26 11:21:50,603 - INFO - +Epoch 18 +2024-05-26 11:23:05,649 - INFO - [204800/569465] Loss : 0.4562 +2024-05-26 11:23:24,580 - INFO - [266240/569465] Loss : 0.4575 +2024-05-26 11:23:57,424 - INFO - Test loss : 0.6571 +2024-05-26 11:23:57,424 - INFO - +Epoch 19 +2024-05-26 11:25:19,064 - INFO - [204800/569465] Loss : 0.4533 +2024-05-26 11:25:40,572 - INFO - [266240/569465] Loss : 0.4545 +2024-05-26 11:26:10,294 - INFO - Test loss : 0.7156 +2024-05-26 11:26:10,294 - INFO - +Epoch 20 +2024-05-26 11:27:20,964 - INFO - [204800/569465] Loss : 0.4502 +2024-05-26 11:27:43,515 - INFO - [266240/569465] Loss : 0.4518 +2024-05-26 11:27:57,052 - INFO - Test loss : 0.6976 +2024-05-26 11:28:00,598 - INFO - (93622,) +2024-05-26 11:28:58,073 - INFO - Split ID: 0 +2024-05-26 11:28:58,119 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 1.17 +2024-05-26 11:28:58,122 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 2.94 +2024-05-26 11:28:58,126 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 4.47 +2024-05-26 11:28:58,129 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 7.73 +2024-05-26 11:28:58,392 - INFO - +No prior +2024-05-26 11:28:59,064 - INFO - (95986,) +2024-05-26 11:29:20,712 - INFO - Split ID: 0 +2024-05-26 11:29:20,713 - INFO - Top 1 (Epoch 20)acc (%): 63.27 +2024-05-26 11:29:20,713 - INFO - Top 3 (Epoch 20)acc (%): 79.82 +2024-05-26 11:29:20,713 - INFO - Top 5 (Epoch 20)acc (%): 84.51 
+2024-05-26 11:29:20,713 - INFO - Top 10 (Epoch 20)acc (%): 88.99 +2024-05-26 11:30:38,819 - INFO - Split ID: 0 +2024-05-26 11:30:38,819 - INFO - Top 1 (Epoch 20)acc (%): 69.26 +2024-05-26 11:30:38,819 - INFO - Top 3 (Epoch 20)acc (%): 84.31 +2024-05-26 11:30:38,819 - INFO - Top 5 (Epoch 20)acc (%): 88.1 +2024-05-26 11:30:38,819 - INFO - Top 10 (Epoch 20)acc (%): 91.63 +2024-05-26 11:30:38,848 - INFO - +Epoch 21 +2024-05-26 11:32:03,009 - INFO - [204800/569465] Loss : 0.4460 +2024-05-26 11:32:26,504 - INFO - [266240/569465] Loss : 0.4475 +2024-05-26 11:32:59,149 - INFO - Test loss : 0.7104 +2024-05-26 11:32:59,150 - INFO - +Epoch 22 +2024-05-26 11:34:14,657 - INFO - [204800/569465] Loss : 0.4435 +2024-05-26 11:34:37,276 - INFO - [266240/569465] Loss : 0.4447 +2024-05-26 11:35:03,604 - INFO - Test loss : 0.7193 +2024-05-26 11:35:03,604 - INFO - +Epoch 23 +2024-05-26 11:36:13,473 - INFO - [204800/569465] Loss : 0.4410 +2024-05-26 11:36:21,477 - INFO - [266240/569465] Loss : 0.4426 +2024-05-26 11:36:55,236 - INFO - Test loss : 0.7298 +2024-05-26 11:36:55,236 - INFO - +Epoch 24 +2024-05-26 11:38:10,859 - INFO - [204800/569465] Loss : 0.4378 +2024-05-26 11:38:33,252 - INFO - [266240/569465] Loss : 0.4387 +2024-05-26 11:39:04,740 - INFO - Test loss : 0.7635 +2024-05-26 11:39:04,740 - INFO - +Epoch 25 +2024-05-26 11:40:21,033 - INFO - [204800/569465] Loss : 0.4348 +2024-05-26 11:40:44,535 - INFO - [266240/569465] Loss : 0.4367 +2024-05-26 11:41:20,002 - INFO - Test loss : 0.7585 +2024-05-26 11:41:20,002 - INFO - +Epoch 26 +2024-05-26 11:42:32,127 - INFO - [204800/569465] Loss : 0.4336 +2024-05-26 11:42:54,824 - INFO - [266240/569465] Loss : 0.4350 +2024-05-26 11:43:25,019 - INFO - Test loss : 0.7718 +2024-05-26 11:43:25,019 - INFO - +Epoch 27 +2024-05-26 11:44:37,238 - INFO - [204800/569465] Loss : 0.4329 +2024-05-26 11:44:57,812 - INFO - [266240/569465] Loss : 0.4339 +2024-05-26 11:45:28,988 - INFO - Test loss : 0.7379 +2024-05-26 11:45:28,988 - INFO - +Epoch 28 +2024-05-26 11:47:00,109 - INFO - [204800/569465] Loss : 0.4296 +2024-05-26 11:47:23,065 - INFO - [266240/569465] Loss : 0.4309 +2024-05-26 11:47:54,315 - INFO - Test loss : 0.7771 +2024-05-26 11:47:54,315 - INFO - +Epoch 29 +2024-05-26 11:49:11,183 - INFO - [204800/569465] Loss : 0.4283 +2024-05-26 11:49:34,056 - INFO - [266240/569465] Loss : 0.4293 +2024-05-26 11:49:44,644 - INFO - Test loss : 0.7846 +2024-05-26 11:49:44,645 - INFO - +Epoch 30 +2024-05-26 11:51:03,096 - INFO - [204800/569465] Loss : 0.4247 +2024-05-26 11:51:27,311 - INFO - [266240/569465] Loss : 0.4261 +2024-05-26 11:51:54,853 - INFO - Test loss : 0.7739 +2024-05-26 11:51:58,494 - INFO - (93622,) +2024-05-26 11:52:54,510 - INFO - Split ID: 0 +2024-05-26 11:52:54,566 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 1.21 +2024-05-26 11:52:54,571 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 3.07 +2024-05-26 11:52:54,576 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 4.65 +2024-05-26 11:52:54,581 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 8.05 +2024-05-26 11:52:54,909 - INFO - +No prior +2024-05-26 11:52:55,600 - INFO - (95986,) +2024-05-26 11:53:19,013 - INFO - Split ID: 0 +2024-05-26 11:53:19,014 - INFO - Top 1 (Epoch 30)acc (%): 63.27 +2024-05-26 11:53:19,014 - INFO - Top 3 (Epoch 30)acc (%): 79.82 +2024-05-26 11:53:19,014 - INFO - Top 5 (Epoch 30)acc (%): 84.51 +2024-05-26 11:53:19,014 - INFO - Top 10 (Epoch 30)acc (%): 88.99 +2024-05-26 11:54:36,451 - INFO - Split ID: 0 +2024-05-26 11:54:36,451 - INFO - Top 1 (Epoch 30)acc (%): 69.1 +2024-05-26 11:54:36,451 - INFO - Top 3 (Epoch 30)acc 
(%): 84.08 +2024-05-26 11:54:36,452 - INFO - Top 5 (Epoch 30)acc (%): 87.87 +2024-05-26 11:54:36,452 - INFO - Top 10 (Epoch 30)acc (%): 91.49 +2024-05-26 11:54:36,489 - INFO - +Epoch 31 +2024-05-26 11:55:52,739 - INFO - [204800/569465] Loss : 0.4232 +2024-05-26 11:56:13,511 - INFO - [266240/569465] Loss : 0.4244 +2024-05-26 11:56:47,567 - INFO - Test loss : 0.7945 +2024-05-26 11:56:47,567 - INFO - +Epoch 32 +2024-05-26 11:58:03,768 - INFO - [204800/569465] Loss : 0.4216 +2024-05-26 11:58:13,448 - INFO - [266240/569465] Loss : 0.4221 +2024-05-26 11:58:38,867 - INFO - Test loss : 0.8227 +2024-05-26 11:58:38,868 - INFO - +Epoch 33 +2024-05-26 11:59:52,899 - INFO - [204800/569465] Loss : 0.4197 +2024-05-26 12:00:16,740 - INFO - [266240/569465] Loss : 0.4207 +2024-05-26 12:00:49,837 - INFO - Test loss : 0.8285 +2024-05-26 12:00:49,837 - INFO - +Epoch 34 +2024-05-26 12:02:05,313 - INFO - [204800/569465] Loss : 0.4184 +2024-05-26 12:02:27,994 - INFO - [266240/569465] Loss : 0.4193 +2024-05-26 12:03:00,545 - INFO - Test loss : 0.8514 +2024-05-26 12:03:00,545 - INFO - +Epoch 35 +2024-05-26 12:04:11,323 - INFO - [204800/569465] Loss : 0.4159 +2024-05-26 12:04:34,798 - INFO - [266240/569465] Loss : 0.4173 +2024-05-26 12:05:05,132 - INFO - Test loss : 0.8235 +2024-05-26 12:05:05,133 - INFO - +Epoch 36 +2024-05-26 12:06:20,341 - INFO - [204800/569465] Loss : 0.4145 +2024-05-26 12:06:37,426 - INFO - [266240/569465] Loss : 0.4157 +2024-05-26 12:07:11,539 - INFO - Test loss : 0.8296 +2024-05-26 12:07:11,540 - INFO - +Epoch 37 +2024-05-26 12:08:25,253 - INFO - [204800/569465] Loss : 0.4128 +2024-05-26 12:08:46,104 - INFO - [266240/569465] Loss : 0.4140 +2024-05-26 12:09:16,492 - INFO - Test loss : 0.8486 +2024-05-26 12:09:16,492 - INFO - +Epoch 38 +2024-05-26 12:10:36,239 - INFO - [204800/569465] Loss : 0.4111 +2024-05-26 12:10:52,767 - INFO - [266240/569465] Loss : 0.4127 +2024-05-26 12:11:25,310 - INFO - Test loss : 0.8520 +2024-05-26 12:11:25,311 - INFO - +Epoch 39 +2024-05-26 12:12:27,745 - INFO - [204800/569465] Loss : 0.4109 +2024-05-26 12:12:50,935 - INFO - [266240/569465] Loss : 0.4116 +2024-05-26 12:13:25,557 - INFO - Test loss : 0.8784 +2024-05-26 12:13:25,557 - INFO - +Epoch 40 +2024-05-26 12:14:45,915 - INFO - [204800/569465] Loss : 0.4086 +2024-05-26 12:15:07,673 - INFO - [266240/569465] Loss : 0.4095 +2024-05-26 12:15:39,846 - INFO - Test loss : 0.8613 +2024-05-26 12:15:43,719 - INFO - (93622,) +2024-05-26 12:16:40,778 - INFO - Split ID: 0 +2024-05-26 12:16:40,836 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 1.22 +2024-05-26 12:16:40,841 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 3.06 +2024-05-26 12:16:40,845 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 4.59 +2024-05-26 12:16:40,850 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 8.11 +2024-05-26 12:16:41,173 - INFO - +No prior +2024-05-26 12:16:41,878 - INFO - (95986,) +2024-05-26 12:17:05,534 - INFO - Split ID: 0 +2024-05-26 12:17:05,534 - INFO - Top 1 (Epoch 40)acc (%): 63.27 +2024-05-26 12:17:05,534 - INFO - Top 3 (Epoch 40)acc (%): 79.82 +2024-05-26 12:17:05,535 - INFO - Top 5 (Epoch 40)acc (%): 84.51 +2024-05-26 12:17:05,535 - INFO - Top 10 (Epoch 40)acc (%): 88.99 +2024-05-26 12:18:23,546 - INFO - Split ID: 0 +2024-05-26 12:18:23,546 - INFO - Top 1 (Epoch 40)acc (%): 68.89 +2024-05-26 12:18:23,546 - INFO - Top 3 (Epoch 40)acc (%): 83.83 +2024-05-26 12:18:23,547 - INFO - Top 5 (Epoch 40)acc (%): 87.7 +2024-05-26 12:18:23,547 - INFO - Top 10 (Epoch 40)acc (%): 91.33 +2024-05-26 12:18:23,549 - INFO - +Epoch 41 +2024-05-26 12:19:39,963 - INFO - 
[204800/569465] Loss : 0.4066 +2024-05-26 12:19:57,446 - INFO - [266240/569465] Loss : 0.4079 +2024-05-26 12:20:30,946 - INFO - Test loss : 0.8620 +2024-05-26 12:20:30,946 - INFO - +Epoch 42 +2024-05-26 12:21:37,026 - INFO - [204800/569465] Loss : 0.4077 +2024-05-26 12:21:58,878 - INFO - [266240/569465] Loss : 0.4083 +2024-05-26 12:22:30,973 - INFO - Test loss : 0.8653 +2024-05-26 12:22:30,973 - INFO - +Epoch 43 +2024-05-26 12:23:46,797 - INFO - [204800/569465] Loss : 0.4055 +2024-05-26 12:24:06,996 - INFO - [266240/569465] Loss : 0.4063 +2024-05-26 12:24:32,709 - INFO - Test loss : 0.9154 +2024-05-26 12:24:32,709 - INFO - +Epoch 44 +2024-05-26 12:25:49,653 - INFO - [204800/569465] Loss : 0.4043 +2024-05-26 12:26:09,383 - INFO - [266240/569465] Loss : 0.4050 +2024-05-26 12:26:37,428 - INFO - Test loss : 0.8978 +2024-05-26 12:26:37,428 - INFO - +Epoch 45 +2024-05-26 12:27:49,556 - INFO - [204800/569465] Loss : 0.4027 +2024-05-26 12:28:10,532 - INFO - [266240/569465] Loss : 0.4038 +2024-05-26 12:28:42,737 - INFO - Test loss : 0.8688 +2024-05-26 12:28:42,737 - INFO - +Epoch 46 +2024-05-26 12:29:55,748 - INFO - [204800/569465] Loss : 0.4021 +2024-05-26 12:30:17,804 - INFO - [266240/569465] Loss : 0.4027 +2024-05-26 12:30:50,028 - INFO - Test loss : 0.9131 +2024-05-26 12:30:50,029 - INFO - +Epoch 47 +2024-05-26 12:32:02,448 - INFO - [204800/569465] Loss : 0.4006 +2024-05-26 12:32:26,262 - INFO - [266240/569465] Loss : 0.4016 +2024-05-26 12:32:56,165 - INFO - Test loss : 0.8744 +2024-05-26 12:32:56,165 - INFO - +Epoch 48 +2024-05-26 12:34:12,087 - INFO - [204800/569465] Loss : 0.3996 +2024-05-26 12:34:33,627 - INFO - [266240/569465] Loss : 0.4003 +2024-05-26 12:34:47,429 - INFO - Test loss : 0.9192 +2024-05-26 12:34:47,429 - INFO - +Epoch 49 +2024-05-26 12:36:00,920 - INFO - [204800/569465] Loss : 0.3980 +2024-05-26 12:36:20,533 - INFO - [266240/569465] Loss : 0.3989 +2024-05-26 12:36:51,121 - INFO - Test loss : 0.9346 +2024-05-26 12:36:51,121 - INFO - +Epoch 50 +2024-05-26 12:38:08,587 - INFO - [204800/569465] Loss : 0.3968 +2024-05-26 12:38:29,419 - INFO - [266240/569465] Loss : 0.3977 +2024-05-26 12:38:56,925 - INFO - Test loss : 0.9194 +2024-05-26 12:39:00,567 - INFO - (93622,) +2024-05-26 12:39:56,846 - INFO - Split ID: 0 +2024-05-26 12:39:56,908 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 1.26 +2024-05-26 12:39:56,913 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 3.29 +2024-05-26 12:39:56,918 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 4.87 +2024-05-26 12:39:56,923 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 8.35 +2024-05-26 12:39:57,221 - INFO - +No prior +2024-05-26 12:39:57,894 - INFO - (95986,) +2024-05-26 12:40:20,809 - INFO - Split ID: 0 +2024-05-26 12:40:20,810 - INFO - Top 1 (Epoch 50)acc (%): 63.27 +2024-05-26 12:40:20,810 - INFO - Top 3 (Epoch 50)acc (%): 79.82 +2024-05-26 12:40:20,811 - INFO - Top 5 (Epoch 50)acc (%): 84.51 +2024-05-26 12:40:20,811 - INFO - Top 10 (Epoch 50)acc (%): 88.99 +2024-05-26 12:41:39,044 - INFO - Split ID: 0 +2024-05-26 12:41:39,044 - INFO - Top 1 (Epoch 50)acc (%): 68.79 +2024-05-26 12:41:39,044 - INFO - Top 3 (Epoch 50)acc (%): 83.7 +2024-05-26 12:41:39,045 - INFO - Top 5 (Epoch 50)acc (%): 87.56 +2024-05-26 12:41:39,045 - INFO - Top 10 (Epoch 50)acc (%): 91.21 +2024-05-26 12:41:39,078 - INFO - +Epoch 51 +2024-05-26 12:42:49,152 - INFO - [204800/569465] Loss : 0.3956 +2024-05-26 12:43:05,574 - INFO - [266240/569465] Loss : 0.3965 +2024-05-26 12:43:22,275 - INFO - Test loss : 0.9476 +2024-05-26 12:43:22,276 - INFO - +Epoch 52 +2024-05-26 12:44:40,037 - INFO - 
[204800/569465] Loss : 0.3944 +2024-05-26 12:44:58,572 - INFO - [266240/569465] Loss : 0.3954 +2024-05-26 12:45:30,837 - INFO - Test loss : 0.9188 +2024-05-26 12:45:30,837 - INFO - +Epoch 53 +2024-05-26 12:46:43,099 - INFO - [204800/569465] Loss : 0.3946 +2024-05-26 12:47:05,292 - INFO - [266240/569465] Loss : 0.3951 +2024-05-26 12:47:34,250 - INFO - Test loss : 0.9156 +2024-05-26 12:47:34,251 - INFO - +Epoch 54 +2024-05-26 12:48:52,249 - INFO - [204800/569465] Loss : 0.3932 +2024-05-26 12:49:14,305 - INFO - [266240/569465] Loss : 0.3938 +2024-05-26 12:49:48,954 - INFO - Test loss : 0.9369 +2024-05-26 12:49:48,955 - INFO - +Epoch 55 +2024-05-26 12:51:05,167 - INFO - [204800/569465] Loss : 0.3930 +2024-05-26 12:51:22,895 - INFO - [266240/569465] Loss : 0.3937 +2024-05-26 12:51:55,218 - INFO - Test loss : 0.9726 +2024-05-26 12:51:55,219 - INFO - +Epoch 56 +2024-05-26 12:53:13,784 - INFO - [204800/569465] Loss : 0.3909 +2024-05-26 12:53:34,039 - INFO - [266240/569465] Loss : 0.3915 +2024-05-26 12:54:01,337 - INFO - Test loss : 0.9569 +2024-05-26 12:54:01,337 - INFO - +Epoch 57 +2024-05-26 12:55:10,189 - INFO - [204800/569465] Loss : 0.3893 +2024-05-26 12:55:32,397 - INFO - [266240/569465] Loss : 0.3907 +2024-05-26 12:56:00,981 - INFO - Test loss : 0.9518 +2024-05-26 12:56:00,982 - INFO - +Epoch 58 +2024-05-26 12:57:01,925 - INFO - [204800/569465] Loss : 0.3901 +2024-05-26 12:57:22,286 - INFO - [266240/569465] Loss : 0.3909 +2024-05-26 12:57:53,639 - INFO - Test loss : 0.9654 +2024-05-26 12:57:53,640 - INFO - +Epoch 59 +2024-05-26 12:59:10,814 - INFO - [204800/569465] Loss : 0.3897 +2024-05-26 12:59:32,976 - INFO - [266240/569465] Loss : 0.3898 +2024-05-26 13:00:02,905 - INFO - Test loss : 0.9592 +2024-05-26 13:00:02,905 - INFO - +Epoch 60 +2024-05-26 13:01:19,426 - INFO - [204800/569465] Loss : 0.3893 +2024-05-26 13:01:39,839 - INFO - [266240/569465] Loss : 0.3897 +2024-05-26 13:02:17,158 - INFO - Test loss : 0.9581 +2024-05-26 13:02:20,864 - INFO - (93622,) +2024-05-26 13:03:43,929 - INFO - Split ID: 0 +2024-05-26 13:03:43,987 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 1.33 +2024-05-26 13:03:43,992 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 3.27 +2024-05-26 13:03:43,997 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 4.99 +2024-05-26 13:03:44,001 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 8.39 +2024-05-26 13:03:44,305 - INFO - +No prior +2024-05-26 13:03:44,998 - INFO - (95986,) +2024-05-26 13:04:08,760 - INFO - Split ID: 0 +2024-05-26 13:04:08,761 - INFO - Top 1 (Epoch 60)acc (%): 63.27 +2024-05-26 13:04:08,761 - INFO - Top 3 (Epoch 60)acc (%): 79.82 +2024-05-26 13:04:08,761 - INFO - Top 5 (Epoch 60)acc (%): 84.51 +2024-05-26 13:04:08,761 - INFO - Top 10 (Epoch 60)acc (%): 88.99 +2024-05-26 13:05:22,694 - INFO - Split ID: 0 +2024-05-26 13:05:22,695 - INFO - Top 1 (Epoch 60)acc (%): 68.59 +2024-05-26 13:05:22,695 - INFO - Top 3 (Epoch 60)acc (%): 83.57 +2024-05-26 13:05:22,695 - INFO - Top 5 (Epoch 60)acc (%): 87.44 +2024-05-26 13:05:22,695 - INFO - Top 10 (Epoch 60)acc (%): 91.1 +2024-05-26 13:05:22,732 - INFO - +Epoch 61 +2024-05-26 13:06:42,831 - INFO - [204800/569465] Loss : 0.3858 +2024-05-26 13:07:00,537 - INFO - [266240/569465] Loss : 0.3870 +2024-05-26 13:07:29,722 - INFO - Test loss : 0.9717 +2024-05-26 13:07:29,723 - INFO - +Epoch 62 +2024-05-26 13:08:48,369 - INFO - [204800/569465] Loss : 0.3870 +2024-05-26 13:09:11,272 - INFO - [266240/569465] Loss : 0.3871 +2024-05-26 13:09:42,164 - INFO - Test loss : 0.9908 +2024-05-26 13:09:42,164 - INFO - +Epoch 63 +2024-05-26 13:10:59,952 - INFO - 
[204800/569465] Loss : 0.3852 +2024-05-26 13:11:22,135 - INFO - [266240/569465] Loss : 0.3856 +2024-05-26 13:11:54,487 - INFO - Test loss : 1.0096 +2024-05-26 13:11:54,488 - INFO - +Epoch 64 +2024-05-26 13:13:07,698 - INFO - [204800/569465] Loss : 0.3846 +2024-05-26 13:13:31,596 - INFO - [266240/569465] Loss : 0.3856 +2024-05-26 13:14:02,071 - INFO - Test loss : 0.9782 +2024-05-26 13:14:02,071 - INFO - +Epoch 65 +2024-05-26 13:15:19,129 - INFO - [204800/569465] Loss : 0.3834 +2024-05-26 13:15:41,970 - INFO - [266240/569465] Loss : 0.3846 +2024-05-26 13:16:11,271 - INFO - Test loss : 0.9957 +2024-05-26 13:16:11,272 - INFO - +Epoch 66 +2024-05-26 13:17:20,072 - INFO - [204800/569465] Loss : 0.3829 +2024-05-26 13:17:37,430 - INFO - [266240/569465] Loss : 0.3836 +2024-05-26 13:17:56,747 - INFO - Test loss : 1.0046 +2024-05-26 13:17:56,747 - INFO - +Epoch 67 +2024-05-26 13:19:15,910 - INFO - [204800/569465] Loss : 0.3828 +2024-05-26 13:19:36,751 - INFO - [266240/569465] Loss : 0.3834 +2024-05-26 13:20:06,004 - INFO - Test loss : 1.0221 +2024-05-26 13:20:06,005 - INFO - +Epoch 68 +2024-05-26 13:21:16,843 - INFO - [204800/569465] Loss : 0.3825 +2024-05-26 13:21:39,327 - INFO - [266240/569465] Loss : 0.3825 +2024-05-26 13:22:08,127 - INFO - Test loss : 1.0254 +2024-05-26 13:22:08,127 - INFO - +Epoch 69 +2024-05-26 13:23:17,038 - INFO - [204800/569465] Loss : 0.3820 +2024-05-26 13:23:36,379 - INFO - [266240/569465] Loss : 0.3828 +2024-05-26 13:24:03,554 - INFO - Test loss : 1.0358 +2024-05-26 13:24:03,555 - INFO - +Epoch 70 +2024-05-26 13:25:17,109 - INFO - [204800/569465] Loss : 0.3801 +2024-05-26 13:25:36,835 - INFO - [266240/569465] Loss : 0.3809 +2024-05-26 13:26:04,558 - INFO - Test loss : 1.0303 +2024-05-26 13:26:08,427 - INFO - (93622,) +2024-05-26 13:27:04,047 - INFO - Split ID: 0 +2024-05-26 13:27:04,105 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 1.31 +2024-05-26 13:27:04,110 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 3.36 +2024-05-26 13:27:04,115 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 4.98 +2024-05-26 13:27:04,120 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 8.54 +2024-05-26 13:27:04,445 - INFO - +No prior +2024-05-26 13:27:05,143 - INFO - (95986,) +2024-05-26 13:27:28,187 - INFO - Split ID: 0 +2024-05-26 13:27:28,188 - INFO - Top 1 (Epoch 70)acc (%): 63.27 +2024-05-26 13:27:28,188 - INFO - Top 3 (Epoch 70)acc (%): 79.82 +2024-05-26 13:27:28,189 - INFO - Top 5 (Epoch 70)acc (%): 84.51 +2024-05-26 13:27:28,189 - INFO - Top 10 (Epoch 70)acc (%): 88.99 +2024-05-26 13:28:41,893 - INFO - Split ID: 0 +2024-05-26 13:28:41,893 - INFO - Top 1 (Epoch 70)acc (%): 68.46 +2024-05-26 13:28:41,893 - INFO - Top 3 (Epoch 70)acc (%): 83.35 +2024-05-26 13:28:41,894 - INFO - Top 5 (Epoch 70)acc (%): 87.25 +2024-05-26 13:28:41,894 - INFO - Top 10 (Epoch 70)acc (%): 90.96 +2024-05-26 13:28:41,923 - INFO - +Epoch 71 +2024-05-26 13:29:55,904 - INFO - [204800/569465] Loss : 0.3795 +2024-05-26 13:30:14,278 - INFO - [266240/569465] Loss : 0.3799 +2024-05-26 13:30:46,275 - INFO - Test loss : 1.0290 +2024-05-26 13:30:46,275 - INFO - +Epoch 72 +2024-05-26 13:31:56,234 - INFO - [204800/569465] Loss : 0.3790 +2024-05-26 13:32:19,735 - INFO - [266240/569465] Loss : 0.3794 +2024-05-26 13:32:51,243 - INFO - Test loss : 1.0482 +2024-05-26 13:32:51,243 - INFO - +Epoch 73 +2024-05-26 13:34:01,420 - INFO - [204800/569465] Loss : 0.3786 +2024-05-26 13:34:20,280 - INFO - [266240/569465] Loss : 0.3790 +2024-05-26 13:34:52,129 - INFO - Test loss : 1.0509 +2024-05-26 13:34:52,129 - INFO - +Epoch 74 +2024-05-26 13:35:57,924 - INFO - 
[204800/569465] Loss : 0.3764 +2024-05-26 13:36:20,122 - INFO - [266240/569465] Loss : 0.3772 +2024-05-26 13:36:51,766 - INFO - Test loss : 1.0405 +2024-05-26 13:36:51,766 - INFO - +Epoch 75 +2024-05-26 13:38:07,654 - INFO - [204800/569465] Loss : 0.3777 +2024-05-26 13:38:28,149 - INFO - [266240/569465] Loss : 0.3785 +2024-05-26 13:38:54,998 - INFO - Test loss : 1.0355 +2024-05-26 13:38:54,999 - INFO - +Epoch 76 +2024-05-26 13:40:08,278 - INFO - [204800/569465] Loss : 0.3763 +2024-05-26 13:40:29,361 - INFO - [266240/569465] Loss : 0.3776 +2024-05-26 13:40:55,331 - INFO - Test loss : 1.0109 +2024-05-26 13:40:55,331 - INFO - +Epoch 77 +2024-05-26 13:41:56,846 - INFO - [204800/569465] Loss : 0.3763 +2024-05-26 13:42:16,116 - INFO - [266240/569465] Loss : 0.3768 +2024-05-26 13:42:45,038 - INFO - Test loss : 1.0454 +2024-05-26 13:42:45,038 - INFO - +Epoch 78 +2024-05-26 13:43:54,342 - INFO - [204800/569465] Loss : 0.3757 +2024-05-26 13:44:13,862 - INFO - [266240/569465] Loss : 0.3764 +2024-05-26 13:44:43,512 - INFO - Test loss : 1.0678 +2024-05-26 13:44:43,512 - INFO - +Epoch 79 +2024-05-26 13:45:52,536 - INFO - [204800/569465] Loss : 0.3746 +2024-05-26 13:46:13,704 - INFO - [266240/569465] Loss : 0.3749 +2024-05-26 13:46:43,542 - INFO - Test loss : 1.0401 +2024-05-26 13:46:43,543 - INFO - +Epoch 80 +2024-05-26 13:47:54,027 - INFO - [204800/569465] Loss : 0.3749 +2024-05-26 13:48:20,207 - INFO - [266240/569465] Loss : 0.3754 +2024-05-26 13:48:51,609 - INFO - Test loss : 1.0671 +2024-05-26 13:49:14,538 - INFO - (93622,) +2024-05-26 13:50:11,399 - INFO - Split ID: 0 +2024-05-26 13:50:11,443 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 1.35 +2024-05-26 13:50:11,446 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 3.37 +2024-05-26 13:50:11,449 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 5.07 +2024-05-26 13:50:11,454 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 8.62 +2024-05-26 13:50:11,856 - INFO - +No prior +2024-05-26 13:50:12,556 - INFO - (95986,) +2024-05-26 13:50:34,888 - INFO - Split ID: 0 +2024-05-26 13:50:34,889 - INFO - Top 1 (Epoch 80)acc (%): 63.27 +2024-05-26 13:50:34,889 - INFO - Top 3 (Epoch 80)acc (%): 79.82 +2024-05-26 13:50:34,889 - INFO - Top 5 (Epoch 80)acc (%): 84.51 +2024-05-26 13:50:34,889 - INFO - Top 10 (Epoch 80)acc (%): 88.99 +2024-05-26 13:51:50,793 - INFO - Split ID: 0 +2024-05-26 13:51:50,793 - INFO - Top 1 (Epoch 80)acc (%): 68.4 +2024-05-26 13:51:50,793 - INFO - Top 3 (Epoch 80)acc (%): 83.28 +2024-05-26 13:51:50,794 - INFO - Top 5 (Epoch 80)acc (%): 87.15 +2024-05-26 13:51:50,794 - INFO - Top 10 (Epoch 80)acc (%): 90.85 +2024-05-26 13:51:50,827 - INFO - +Epoch 81 +2024-05-26 13:53:01,312 - INFO - [204800/569465] Loss : 0.3734 +2024-05-26 13:53:20,970 - INFO - [266240/569465] Loss : 0.3744 +2024-05-26 13:53:48,808 - INFO - Test loss : 1.0547 +2024-05-26 13:53:48,808 - INFO - +Epoch 82 +2024-05-26 13:55:02,539 - INFO - [204800/569465] Loss : 0.3728 +2024-05-26 13:55:23,843 - INFO - [266240/569465] Loss : 0.3738 +2024-05-26 13:55:51,895 - INFO - Test loss : 1.0545 +2024-05-26 13:55:51,895 - INFO - +Epoch 83 +2024-05-26 13:57:05,076 - INFO - [204800/569465] Loss : 0.3718 +2024-05-26 13:57:24,099 - INFO - [266240/569465] Loss : 0.3730 +2024-05-26 13:57:50,969 - INFO - Test loss : 1.0435 +2024-05-26 13:57:50,969 - INFO - +Epoch 84 +2024-05-26 13:59:01,136 - INFO - [204800/569465] Loss : 0.3729 +2024-05-26 13:59:19,103 - INFO - [266240/569465] Loss : 0.3731 +2024-05-26 13:59:49,036 - INFO - Test loss : 1.0745 +2024-05-26 13:59:49,037 - INFO - +Epoch 85 +2024-05-26 14:01:01,528 - INFO - 
[204800/569465] Loss : 0.3727 +2024-05-26 14:01:21,334 - INFO - [266240/569465] Loss : 0.3736 +2024-05-26 14:01:53,050 - INFO - Test loss : 1.0566 +2024-05-26 14:01:53,050 - INFO - +Epoch 86 +2024-05-26 14:02:51,739 - INFO - [204800/569465] Loss : 0.3710 +2024-05-26 14:03:09,452 - INFO - [266240/569465] Loss : 0.3716 +2024-05-26 14:03:36,823 - INFO - Test loss : 1.0834 +2024-05-26 14:03:36,824 - INFO - +Epoch 87 +2024-05-26 14:04:52,209 - INFO - [204800/569465] Loss : 0.3702 +2024-05-26 14:05:12,896 - INFO - [266240/569465] Loss : 0.3711 +2024-05-26 14:05:47,409 - INFO - Test loss : 1.0979 +2024-05-26 14:05:47,410 - INFO - +Epoch 88 +2024-05-26 14:07:02,128 - INFO - [204800/569465] Loss : 0.3707 +2024-05-26 14:07:23,181 - INFO - [266240/569465] Loss : 0.3713 +2024-05-26 14:07:58,438 - INFO - Test loss : 1.0727 +2024-05-26 14:07:58,438 - INFO - +Epoch 89 +2024-05-26 14:09:14,244 - INFO - [204800/569465] Loss : 0.3702 +2024-05-26 14:09:34,821 - INFO - [266240/569465] Loss : 0.3708 +2024-05-26 14:10:08,898 - INFO - Test loss : 1.0963 +2024-05-26 14:10:08,898 - INFO - +Epoch 90 +2024-05-26 14:11:26,997 - INFO - [204800/569465] Loss : 0.3695 +2024-05-26 14:11:49,046 - INFO - [266240/569465] Loss : 0.3701 +2024-05-26 14:12:23,655 - INFO - Test loss : 1.0931 +2024-05-26 14:12:26,256 - INFO - (93622,) +2024-05-26 14:13:22,501 - INFO - Split ID: 0 +2024-05-26 14:13:22,541 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 1.38 +2024-05-26 14:13:22,544 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 3.41 +2024-05-26 14:13:22,547 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 5.1 +2024-05-26 14:13:22,551 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 8.63 +2024-05-26 14:13:22,833 - INFO - +No prior +2024-05-26 14:13:23,429 - INFO - (95986,) +2024-05-26 14:13:44,940 - INFO - Split ID: 0 +2024-05-26 14:13:44,940 - INFO - Top 1 (Epoch 90)acc (%): 63.27 +2024-05-26 14:13:44,941 - INFO - Top 3 (Epoch 90)acc (%): 79.82 +2024-05-26 14:13:44,941 - INFO - Top 5 (Epoch 90)acc (%): 84.51 +2024-05-26 14:13:44,941 - INFO - Top 10 (Epoch 90)acc (%): 88.99 +2024-05-26 14:15:02,084 - INFO - Split ID: 0 +2024-05-26 14:15:02,085 - INFO - Top 1 (Epoch 90)acc (%): 68.31 +2024-05-26 14:15:02,085 - INFO - Top 3 (Epoch 90)acc (%): 83.2 +2024-05-26 14:15:02,085 - INFO - Top 5 (Epoch 90)acc (%): 87.1 +2024-05-26 14:15:02,085 - INFO - Top 10 (Epoch 90)acc (%): 90.8 +2024-05-26 14:15:02,087 - INFO - +Epoch 91 +2024-05-26 14:16:19,963 - INFO - [204800/569465] Loss : 0.3682 +2024-05-26 14:16:40,320 - INFO - [266240/569465] Loss : 0.3694 +2024-05-26 14:17:15,045 - INFO - Test loss : 1.0954 +2024-05-26 14:17:15,045 - INFO - +Epoch 92 +2024-05-26 14:18:33,525 - INFO - [204800/569465] Loss : 0.3699 +2024-05-26 14:18:54,615 - INFO - [266240/569465] Loss : 0.3702 +2024-05-26 14:19:26,797 - INFO - Test loss : 1.1040 +2024-05-26 14:19:26,797 - INFO - +Epoch 93 +2024-05-26 14:20:43,651 - INFO - [204800/569465] Loss : 0.3675 +2024-05-26 14:21:06,634 - INFO - [266240/569465] Loss : 0.3687 +2024-05-26 14:21:39,202 - INFO - Test loss : 1.0894 +2024-05-26 14:21:39,202 - INFO - +Epoch 94 +2024-05-26 14:22:59,644 - INFO - [204800/569465] Loss : 0.3676 +2024-05-26 14:23:22,105 - INFO - [266240/569465] Loss : 0.3680 +2024-05-26 14:23:54,186 - INFO - Test loss : 1.1030 +2024-05-26 14:23:54,186 - INFO - +Epoch 95 +2024-05-26 14:25:12,652 - INFO - [204800/569465] Loss : 0.3678 +2024-05-26 14:25:35,402 - INFO - [266240/569465] Loss : 0.3682 +2024-05-26 14:25:47,574 - INFO - Test loss : 1.1011 +2024-05-26 14:25:47,574 - INFO - +Epoch 96 +2024-05-26 14:27:02,709 - INFO - 
[204800/569465] Loss : 0.3670 +2024-05-26 14:27:25,707 - INFO - [266240/569465] Loss : 0.3675 +2024-05-26 14:27:57,883 - INFO - Test loss : 1.1310 +2024-05-26 14:27:57,883 - INFO - +Epoch 97 +2024-05-26 14:29:13,930 - INFO - [204800/569465] Loss : 0.3668 +2024-05-26 14:29:35,050 - INFO - [266240/569465] Loss : 0.3668 +2024-05-26 14:30:06,908 - INFO - Test loss : 1.1422 +2024-05-26 14:30:06,909 - INFO - +Epoch 98 +2024-05-26 14:31:23,621 - INFO - [204800/569465] Loss : 0.3658 +2024-05-26 14:31:45,700 - INFO - [266240/569465] Loss : 0.3668 +2024-05-26 14:32:19,632 - INFO - Test loss : 1.1048 +2024-05-26 14:32:19,632 - INFO - +Epoch 99 +2024-05-26 14:33:35,257 - INFO - [204800/569465] Loss : 0.3656 +2024-05-26 14:33:59,152 - INFO - [266240/569465] Loss : 0.3663 +2024-05-26 14:34:29,378 - INFO - Test loss : 1.1180 +2024-05-26 14:34:29,379 - INFO - Saving output model to ../models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-26 14:34:29,448 - INFO - Saving output model to ../models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-26 14:34:29,615 - INFO - +No prior +2024-05-26 14:34:30,298 - INFO - (95986,) +2024-05-26 14:34:51,900 - INFO - Split ID: 0 +2024-05-26 14:34:51,901 - INFO - Top 1 acc (%): 63.27 +2024-05-26 14:34:51,901 - INFO - Top 3 acc (%): 79.82 +2024-05-26 14:34:51,902 - INFO - Top 5 acc (%): 84.51 +2024-05-26 14:34:51,902 - INFO - Top 10 acc (%): 88.99 +2024-05-26 14:36:08,245 - INFO - Split ID: 0 +2024-05-26 14:36:08,245 - INFO - Top 1 acc (%): 68.3 +2024-05-26 14:36:08,245 - INFO - Top 3 acc (%): 83.16 +2024-05-26 14:36:08,246 - INFO - Top 5 acc (%): 87.04 +2024-05-26 14:36:08,246 - INFO - Top 10 acc (%): 90.75 +2024-05-26 14:36:08,274 - INFO - +Space2Vec-theory +2024-05-26 14:36:08,274 - INFO - Model : model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-26 14:36:11,712 - INFO - (93622,) +2024-05-26 14:37:05,866 - INFO - Split ID: 0 +2024-05-26 14:37:05,919 - INFO - Top 1 LocEnc acc (%): 1.41 +2024-05-26 14:37:05,922 - INFO - Top 3 LocEnc acc (%): 3.46 +2024-05-26 14:37:05,926 - INFO - Top 5 LocEnc acc (%): 5.23 +2024-05-26 14:37:05,931 - INFO - Top 10 LocEnc acc (%): 8.81 +2024-05-31 01:58:48,099 - INFO - +num_classes 5089 +2024-05-31 01:58:48,099 - INFO - num train 569465 +2024-05-31 01:58:48,099 - INFO - num val 93622 +2024-05-31 01:58:48,099 - INFO - train loss full_loss +2024-05-31 01:58:48,099 - INFO - model name ../models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:58:48,099 - INFO - num users 17302 +2024-05-31 01:58:48,979 - INFO - +Only Space2Vec-theory +2024-05-31 01:58:48,979 - INFO - Model : model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:58:49,239 - INFO - Saving output model to ../models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:58:49,416 - INFO - Saving output model to ../models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:58:49,608 - INFO - +No prior +2024-05-31 01:58:50,259 - INFO - (95986,) +2024-05-31 01:59:21,696 - INFO - Save results to 
../eval_results/eval_inat_2017__val_no_prior.csv +2024-05-31 01:59:21,697 - INFO - Split ID: 0 +2024-05-31 01:59:21,697 - INFO - Top 1 acc (%): 63.27 +2024-05-31 01:59:21,697 - INFO - Top 3 acc (%): 79.82 +2024-05-31 01:59:21,697 - INFO - Top 5 acc (%): 84.51 +2024-05-31 01:59:21,697 - INFO - Top 10 acc (%): 88.99 +2024-05-31 02:00:49,976 - INFO - Split ID: 0 +2024-05-31 02:00:49,977 - INFO - Top 1 hit (%): 68.3 +2024-05-31 02:00:49,977 - INFO - Top 3 hit (%): 83.16 +2024-05-31 02:00:49,977 - INFO - Top 5 hit (%): 87.04 +2024-05-31 02:00:49,977 - INFO - Top 10 hit (%): 90.75 +2024-05-31 02:00:50,022 - INFO - +Only Space2Vec-theory +2024-05-31 02:00:50,022 - INFO - Model : model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 02:00:52,131 - INFO - (93622,) +2024-05-31 02:01:38,448 - INFO - Split ID: 0 +2024-05-31 02:01:38,480 - INFO - Top 1 LocEnc acc (%): 1.41 +2024-05-31 02:01:38,483 - INFO - Top 3 LocEnc acc (%): 3.46 +2024-05-31 02:01:38,486 - INFO - Top 5 LocEnc acc (%): 5.23 +2024-05-31 02:01:38,489 - INFO - Top 10 LocEnc acc (%): 8.81 diff --git a/pre_trained_models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..163a1baf Binary files /dev/null and b/pre_trained_models/space2vec_theory/model_inat_2017_Space2Vec-theory_inception_v3_0.0200_64_0.1000000_360.000_1_256_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..ed4bf158 --- /dev/null +++ b/pre_trained_models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.log @@ -0,0 +1,737 @@ +2024-05-25 13:46:49,914 - INFO - +num_classes 8142 +2024-05-25 13:46:49,914 - INFO - num train 436063 +2024-05-25 13:46:49,914 - INFO - num val 24343 +2024-05-25 13:46:49,914 - INFO - train loss full_loss +2024-05-25 13:46:49,914 - INFO - model name ../models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-25 13:46:49,914 - INFO - num users 18643 +2024-05-25 13:46:50,816 - INFO - +Epoch 0 +2024-05-25 13:47:24,357 - INFO - [204800/436063] Loss : 1.6830 +2024-05-25 13:47:32,929 - INFO - [262144/436063] Loss : 1.4991 +2024-05-25 13:47:36,399 - INFO - Test loss : 0.2698 +2024-05-25 13:47:36,399 - INFO - +Epoch 1 +2024-05-25 13:48:10,666 - INFO - [204800/436063] Loss : 0.6526 +2024-05-25 13:48:19,416 - INFO - [262144/436063] Loss : 0.6402 +2024-05-25 13:48:23,112 - INFO - Test loss : 0.1556 +2024-05-25 13:48:23,112 - INFO - +Epoch 2 +2024-05-25 13:48:56,707 - INFO - [204800/436063] Loss : 0.5398 +2024-05-25 13:49:05,164 - INFO - [262144/436063] Loss : 0.5358 +2024-05-25 13:49:08,727 - INFO - Test loss : 0.1479 +2024-05-25 13:49:08,727 - INFO - +Epoch 3 +2024-05-25 13:49:41,622 - INFO - [204800/436063] Loss : 0.4860 +2024-05-25 13:49:49,839 - INFO - [262144/436063] Loss : 0.4837 +2024-05-25 13:49:53,159 - INFO - Test loss : 0.1327 +2024-05-25 13:49:53,160 - INFO - +Epoch 4 +2024-05-25 13:50:26,408 - INFO - [204800/436063] Loss : 
0.4521 +2024-05-25 13:50:35,059 - INFO - [262144/436063] Loss : 0.4515 +2024-05-25 13:50:38,828 - INFO - Test loss : 0.1361 +2024-05-25 13:50:38,828 - INFO - +Epoch 5 +2024-05-25 13:51:14,113 - INFO - [204800/436063] Loss : 0.4307 +2024-05-25 13:51:22,723 - INFO - [262144/436063] Loss : 0.4303 +2024-05-25 13:51:26,205 - INFO - Test loss : 0.1489 +2024-05-25 13:51:26,206 - INFO - +Epoch 6 +2024-05-25 13:51:58,102 - INFO - [204800/436063] Loss : 0.4122 +2024-05-25 13:52:06,755 - INFO - [262144/436063] Loss : 0.4124 +2024-05-25 13:52:10,383 - INFO - Test loss : 0.1514 +2024-05-25 13:52:10,383 - INFO - +Epoch 7 +2024-05-25 13:52:43,023 - INFO - [204800/436063] Loss : 0.3964 +2024-05-25 13:52:51,837 - INFO - [262144/436063] Loss : 0.3975 +2024-05-25 13:52:55,352 - INFO - Test loss : 0.1469 +2024-05-25 13:52:55,353 - INFO - +Epoch 8 +2024-05-25 13:53:30,343 - INFO - [204800/436063] Loss : 0.3861 +2024-05-25 13:53:39,256 - INFO - [262144/436063] Loss : 0.3870 +2024-05-25 13:53:42,906 - INFO - Test loss : 0.1578 +2024-05-25 13:53:42,906 - INFO - +Epoch 9 +2024-05-25 13:54:18,026 - INFO - [204800/436063] Loss : 0.3752 +2024-05-25 13:54:27,456 - INFO - [262144/436063] Loss : 0.3766 +2024-05-25 13:54:30,899 - INFO - Test loss : 0.1752 +2024-05-25 13:54:30,899 - INFO - +Epoch 10 +2024-05-25 13:55:04,518 - INFO - [204800/436063] Loss : 0.3682 +2024-05-25 13:55:12,509 - INFO - [262144/436063] Loss : 0.3691 +2024-05-25 13:55:15,967 - INFO - Test loss : 0.1782 +2024-05-25 13:55:16,730 - INFO - (24343,) +2024-05-25 13:55:42,133 - INFO - Split ID: 0 +2024-05-25 13:55:42,142 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 1.61 +2024-05-25 13:55:42,143 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 4.3 +2024-05-25 13:55:42,143 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 6.79 +2024-05-25 13:55:42,144 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 12.18 +2024-05-25 13:55:42,252 - INFO - +No prior +2024-05-25 13:55:42,494 - INFO - (24426,) +2024-05-25 13:55:53,504 - INFO - Split ID: 0 +2024-05-25 13:55:53,504 - INFO - Top 1 (Epoch 10)acc (%): 60.2 +2024-05-25 13:55:53,504 - INFO - Top 3 (Epoch 10)acc (%): 77.9 +2024-05-25 13:55:53,504 - INFO - Top 5 (Epoch 10)acc (%): 83.29 +2024-05-25 13:55:53,504 - INFO - Top 10 (Epoch 10)acc (%): 88.5 +2024-05-25 13:56:16,339 - INFO - Split ID: 0 +2024-05-25 13:56:16,339 - INFO - Top 1 (Epoch 10)acc (%): 72.24 +2024-05-25 13:56:16,339 - INFO - Top 3 (Epoch 10)acc (%): 87.2 +2024-05-25 13:56:16,339 - INFO - Top 5 (Epoch 10)acc (%): 90.72 +2024-05-25 13:56:16,339 - INFO - Top 10 (Epoch 10)acc (%): 93.88 +2024-05-25 13:56:16,340 - INFO - +Epoch 11 +2024-05-25 13:56:50,719 - INFO - [204800/436063] Loss : 0.3588 +2024-05-25 13:56:59,374 - INFO - [262144/436063] Loss : 0.3603 +2024-05-25 13:57:02,645 - INFO - Test loss : 0.1842 +2024-05-25 13:57:02,646 - INFO - +Epoch 12 +2024-05-25 13:57:35,062 - INFO - [204800/436063] Loss : 0.3537 +2024-05-25 13:57:43,286 - INFO - [262144/436063] Loss : 0.3547 +2024-05-25 13:57:46,525 - INFO - Test loss : 0.1814 +2024-05-25 13:57:46,525 - INFO - +Epoch 13 +2024-05-25 13:58:20,527 - INFO - [204800/436063] Loss : 0.3484 +2024-05-25 13:58:29,404 - INFO - [262144/436063] Loss : 0.3494 +2024-05-25 13:58:33,079 - INFO - Test loss : 0.1829 +2024-05-25 13:58:33,080 - INFO - +Epoch 14 +2024-05-25 13:59:05,123 - INFO - [204800/436063] Loss : 0.3441 +2024-05-25 13:59:13,478 - INFO - [262144/436063] Loss : 0.3447 +2024-05-25 13:59:16,889 - INFO - Test loss : 0.1869 +2024-05-25 13:59:16,889 - INFO - +Epoch 15 +2024-05-25 13:59:50,640 - INFO - [204800/436063] Loss : 0.3384 
+2024-05-25 13:59:59,393 - INFO - [262144/436063] Loss : 0.3394 +2024-05-25 14:00:02,872 - INFO - Test loss : 0.1967 +2024-05-25 14:00:02,872 - INFO - +Epoch 16 +2024-05-25 14:00:38,168 - INFO - [204800/436063] Loss : 0.3348 +2024-05-25 14:00:50,238 - INFO - [262144/436063] Loss : 0.3361 +2024-05-25 14:00:54,001 - INFO - Test loss : 0.2114 +2024-05-25 14:00:54,001 - INFO - +Epoch 17 +2024-05-25 14:01:30,426 - INFO - [204800/436063] Loss : 0.3291 +2024-05-25 14:01:39,878 - INFO - [262144/436063] Loss : 0.3305 +2024-05-25 14:01:43,625 - INFO - Test loss : 0.2232 +2024-05-25 14:01:43,625 - INFO - +Epoch 18 +2024-05-25 14:02:19,537 - INFO - [204800/436063] Loss : 0.3261 +2024-05-25 14:02:27,795 - INFO - [262144/436063] Loss : 0.3273 +2024-05-25 14:02:30,882 - INFO - Test loss : 0.2113 +2024-05-25 14:02:30,883 - INFO - +Epoch 19 +2024-05-25 14:03:02,330 - INFO - [204800/436063] Loss : 0.3221 +2024-05-25 14:03:10,269 - INFO - [262144/436063] Loss : 0.3236 +2024-05-25 14:03:13,411 - INFO - Test loss : 0.2260 +2024-05-25 14:03:13,411 - INFO - +Epoch 20 +2024-05-25 14:03:43,480 - INFO - [204800/436063] Loss : 0.3201 +2024-05-25 14:03:51,372 - INFO - [262144/436063] Loss : 0.3214 +2024-05-25 14:03:55,091 - INFO - Test loss : 0.2203 +2024-05-25 14:03:55,789 - INFO - (24343,) +2024-05-25 14:04:16,107 - INFO - Split ID: 0 +2024-05-25 14:04:16,116 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 2.8 +2024-05-25 14:04:16,116 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 6.85 +2024-05-25 14:04:16,117 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 10.09 +2024-05-25 14:04:16,118 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 16.96 +2024-05-25 14:04:16,224 - INFO - +No prior +2024-05-25 14:04:16,448 - INFO - (24426,) +2024-05-25 14:04:26,012 - INFO - Split ID: 0 +2024-05-25 14:04:26,013 - INFO - Top 1 (Epoch 20)acc (%): 60.2 +2024-05-25 14:04:26,013 - INFO - Top 3 (Epoch 20)acc (%): 77.9 +2024-05-25 14:04:26,013 - INFO - Top 5 (Epoch 20)acc (%): 83.29 +2024-05-25 14:04:26,013 - INFO - Top 10 (Epoch 20)acc (%): 88.5 +2024-05-25 14:04:49,521 - INFO - Split ID: 0 +2024-05-25 14:04:49,521 - INFO - Top 1 (Epoch 20)acc (%): 72.84 +2024-05-25 14:04:49,521 - INFO - Top 3 (Epoch 20)acc (%): 87.48 +2024-05-25 14:04:49,521 - INFO - Top 5 (Epoch 20)acc (%): 90.84 +2024-05-25 14:04:49,521 - INFO - Top 10 (Epoch 20)acc (%): 94.01 +2024-05-25 14:04:49,522 - INFO - +Epoch 21 +2024-05-25 14:05:22,712 - INFO - [204800/436063] Loss : 0.3167 +2024-05-25 14:05:31,573 - INFO - [262144/436063] Loss : 0.3182 +2024-05-25 14:05:34,755 - INFO - Test loss : 0.2460 +2024-05-25 14:05:34,755 - INFO - +Epoch 22 +2024-05-25 14:06:08,812 - INFO - [204800/436063] Loss : 0.3146 +2024-05-25 14:06:17,529 - INFO - [262144/436063] Loss : 0.3155 +2024-05-25 14:06:21,020 - INFO - Test loss : 0.2313 +2024-05-25 14:06:21,020 - INFO - +Epoch 23 +2024-05-25 14:06:55,790 - INFO - [204800/436063] Loss : 0.3109 +2024-05-25 14:07:04,096 - INFO - [262144/436063] Loss : 0.3127 +2024-05-25 14:07:07,491 - INFO - Test loss : 0.2331 +2024-05-25 14:07:07,492 - INFO - +Epoch 24 +2024-05-25 14:07:41,000 - INFO - [204800/436063] Loss : 0.3108 +2024-05-25 14:07:49,860 - INFO - [262144/436063] Loss : 0.3114 +2024-05-25 14:07:53,371 - INFO - Test loss : 0.2555 +2024-05-25 14:07:53,371 - INFO - +Epoch 25 +2024-05-25 14:08:26,402 - INFO - [204800/436063] Loss : 0.3070 +2024-05-25 14:08:34,839 - INFO - [262144/436063] Loss : 0.3079 +2024-05-25 14:08:38,159 - INFO - Test loss : 0.2531 +2024-05-25 14:08:38,159 - INFO - +Epoch 26 +2024-05-25 14:09:10,065 - INFO - [204800/436063] Loss : 0.3049 
+2024-05-25 14:09:18,135 - INFO - [262144/436063] Loss : 0.3060 +2024-05-25 14:09:21,559 - INFO - Test loss : 0.2498 +2024-05-25 14:09:21,560 - INFO - +Epoch 27 +2024-05-25 14:09:56,302 - INFO - [204800/436063] Loss : 0.3026 +2024-05-25 14:10:04,545 - INFO - [262144/436063] Loss : 0.3041 +2024-05-25 14:10:08,047 - INFO - Test loss : 0.2438 +2024-05-25 14:10:08,048 - INFO - +Epoch 28 +2024-05-25 14:10:40,986 - INFO - [204800/436063] Loss : 0.3005 +2024-05-25 14:10:49,100 - INFO - [262144/436063] Loss : 0.3018 +2024-05-25 14:10:52,294 - INFO - Test loss : 0.2505 +2024-05-25 14:10:52,294 - INFO - +Epoch 29 +2024-05-25 14:11:24,212 - INFO - [204800/436063] Loss : 0.2992 +2024-05-25 14:11:32,278 - INFO - [262144/436063] Loss : 0.2997 +2024-05-25 14:11:35,602 - INFO - Test loss : 0.2768 +2024-05-25 14:11:35,603 - INFO - +Epoch 30 +2024-05-25 14:12:07,177 - INFO - [204800/436063] Loss : 0.2967 +2024-05-25 14:12:15,644 - INFO - [262144/436063] Loss : 0.2977 +2024-05-25 14:12:19,276 - INFO - Test loss : 0.2587 +2024-05-25 14:12:19,976 - INFO - (24343,) +2024-05-25 14:12:40,314 - INFO - Split ID: 0 +2024-05-25 14:12:40,323 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 3.1 +2024-05-25 14:12:40,323 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 7.77 +2024-05-25 14:12:40,324 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 11.6 +2024-05-25 14:12:40,325 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 19.37 +2024-05-25 14:12:40,425 - INFO - +No prior +2024-05-25 14:12:40,637 - INFO - (24426,) +2024-05-25 14:12:50,159 - INFO - Split ID: 0 +2024-05-25 14:12:50,160 - INFO - Top 1 (Epoch 30)acc (%): 60.2 +2024-05-25 14:12:50,160 - INFO - Top 3 (Epoch 30)acc (%): 77.9 +2024-05-25 14:12:50,160 - INFO - Top 5 (Epoch 30)acc (%): 83.29 +2024-05-25 14:12:50,160 - INFO - Top 10 (Epoch 30)acc (%): 88.5 +2024-05-25 14:13:14,446 - INFO - Split ID: 0 +2024-05-25 14:13:14,447 - INFO - Top 1 (Epoch 30)acc (%): 73.0 +2024-05-25 14:13:14,447 - INFO - Top 3 (Epoch 30)acc (%): 87.55 +2024-05-25 14:13:14,447 - INFO - Top 5 (Epoch 30)acc (%): 90.93 +2024-05-25 14:13:14,447 - INFO - Top 10 (Epoch 30)acc (%): 94.05 +2024-05-25 14:13:14,447 - INFO - +Epoch 31 +2024-05-25 14:13:53,037 - INFO - [204800/436063] Loss : 0.2955 +2024-05-25 14:14:04,141 - INFO - [262144/436063] Loss : 0.2967 +2024-05-25 14:14:08,257 - INFO - Test loss : 0.2607 +2024-05-25 14:14:08,257 - INFO - +Epoch 32 +2024-05-25 14:14:41,603 - INFO - [204800/436063] Loss : 0.2937 +2024-05-25 14:14:50,237 - INFO - [262144/436063] Loss : 0.2945 +2024-05-25 14:14:53,384 - INFO - Test loss : 0.2871 +2024-05-25 14:14:53,384 - INFO - +Epoch 33 +2024-05-25 14:15:24,249 - INFO - [204800/436063] Loss : 0.2905 +2024-05-25 14:15:32,205 - INFO - [262144/436063] Loss : 0.2915 +2024-05-25 14:15:35,298 - INFO - Test loss : 0.2687 +2024-05-25 14:15:35,298 - INFO - +Epoch 34 +2024-05-25 14:16:08,114 - INFO - [204800/436063] Loss : 0.2902 +2024-05-25 14:16:16,345 - INFO - [262144/436063] Loss : 0.2913 +2024-05-25 14:16:20,065 - INFO - Test loss : 0.2996 +2024-05-25 14:16:20,066 - INFO - +Epoch 35 +2024-05-25 14:16:52,863 - INFO - [204800/436063] Loss : 0.2885 +2024-05-25 14:17:01,027 - INFO - [262144/436063] Loss : 0.2890 +2024-05-25 14:17:04,408 - INFO - Test loss : 0.2824 +2024-05-25 14:17:04,408 - INFO - +Epoch 36 +2024-05-25 14:17:36,158 - INFO - [204800/436063] Loss : 0.2879 +2024-05-25 14:17:44,229 - INFO - [262144/436063] Loss : 0.2887 +2024-05-25 14:17:47,450 - INFO - Test loss : 0.2986 +2024-05-25 14:17:47,450 - INFO - +Epoch 37 +2024-05-25 14:18:19,214 - INFO - [204800/436063] Loss : 0.2854 
+2024-05-25 14:18:27,443 - INFO - [262144/436063] Loss : 0.2863 +2024-05-25 14:18:30,648 - INFO - Test loss : 0.3049 +2024-05-25 14:18:30,648 - INFO - +Epoch 38 +2024-05-25 14:19:03,688 - INFO - [204800/436063] Loss : 0.2835 +2024-05-25 14:19:12,481 - INFO - [262144/436063] Loss : 0.2847 +2024-05-25 14:19:16,045 - INFO - Test loss : 0.2744 +2024-05-25 14:19:16,045 - INFO - +Epoch 39 +2024-05-25 14:19:51,684 - INFO - [204800/436063] Loss : 0.2825 +2024-05-25 14:20:00,384 - INFO - [262144/436063] Loss : 0.2832 +2024-05-25 14:20:03,860 - INFO - Test loss : 0.3075 +2024-05-25 14:20:03,860 - INFO - +Epoch 40 +2024-05-25 14:20:35,403 - INFO - [204800/436063] Loss : 0.2819 +2024-05-25 14:20:43,261 - INFO - [262144/436063] Loss : 0.2823 +2024-05-25 14:20:46,693 - INFO - Test loss : 0.2933 +2024-05-25 14:20:47,392 - INFO - (24343,) +2024-05-25 14:21:07,841 - INFO - Split ID: 0 +2024-05-25 14:21:07,850 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 3.74 +2024-05-25 14:21:07,851 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 8.92 +2024-05-25 14:21:07,852 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 13.15 +2024-05-25 14:21:07,853 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 21.02 +2024-05-25 14:21:07,953 - INFO - +No prior +2024-05-25 14:21:08,175 - INFO - (24426,) +2024-05-25 14:21:17,739 - INFO - Split ID: 0 +2024-05-25 14:21:17,740 - INFO - Top 1 (Epoch 40)acc (%): 60.2 +2024-05-25 14:21:17,740 - INFO - Top 3 (Epoch 40)acc (%): 77.9 +2024-05-25 14:21:17,740 - INFO - Top 5 (Epoch 40)acc (%): 83.29 +2024-05-25 14:21:17,740 - INFO - Top 10 (Epoch 40)acc (%): 88.5 +2024-05-25 14:21:41,087 - INFO - Split ID: 0 +2024-05-25 14:21:41,087 - INFO - Top 1 (Epoch 40)acc (%): 73.23 +2024-05-25 14:21:41,087 - INFO - Top 3 (Epoch 40)acc (%): 87.6 +2024-05-25 14:21:41,087 - INFO - Top 5 (Epoch 40)acc (%): 90.95 +2024-05-25 14:21:41,087 - INFO - Top 10 (Epoch 40)acc (%): 94.03 +2024-05-25 14:21:41,088 - INFO - +Epoch 41 +2024-05-25 14:22:12,432 - INFO - [204800/436063] Loss : 0.2801 +2024-05-25 14:22:20,650 - INFO - [262144/436063] Loss : 0.2810 +2024-05-25 14:22:23,964 - INFO - Test loss : 0.3059 +2024-05-25 14:22:23,964 - INFO - +Epoch 42 +2024-05-25 14:22:58,406 - INFO - [204800/436063] Loss : 0.2782 +2024-05-25 14:23:07,072 - INFO - [262144/436063] Loss : 0.2792 +2024-05-25 14:23:10,652 - INFO - Test loss : 0.3069 +2024-05-25 14:23:10,652 - INFO - +Epoch 43 +2024-05-25 14:23:43,823 - INFO - [204800/436063] Loss : 0.2776 +2024-05-25 14:23:52,033 - INFO - [262144/436063] Loss : 0.2786 +2024-05-25 14:23:55,673 - INFO - Test loss : 0.3212 +2024-05-25 14:23:55,673 - INFO - +Epoch 44 +2024-05-25 14:24:29,075 - INFO - [204800/436063] Loss : 0.2773 +2024-05-25 14:24:37,696 - INFO - [262144/436063] Loss : 0.2780 +2024-05-25 14:24:40,795 - INFO - Test loss : 0.3054 +2024-05-25 14:24:40,795 - INFO - +Epoch 45 +2024-05-25 14:25:14,175 - INFO - [204800/436063] Loss : 0.2752 +2024-05-25 14:25:22,937 - INFO - [262144/436063] Loss : 0.2760 +2024-05-25 14:25:26,609 - INFO - Test loss : 0.3236 +2024-05-25 14:25:26,610 - INFO - +Epoch 46 +2024-05-25 14:26:00,650 - INFO - [204800/436063] Loss : 0.2744 +2024-05-25 14:26:09,207 - INFO - [262144/436063] Loss : 0.2750 +2024-05-25 14:26:12,277 - INFO - Test loss : 0.3202 +2024-05-25 14:26:12,277 - INFO - +Epoch 47 +2024-05-25 14:26:46,225 - INFO - [204800/436063] Loss : 0.2729 +2024-05-25 14:26:55,321 - INFO - [262144/436063] Loss : 0.2735 +2024-05-25 14:26:58,923 - INFO - Test loss : 0.3192 +2024-05-25 14:26:58,923 - INFO - +Epoch 48 +2024-05-25 14:27:34,471 - INFO - [204800/436063] Loss : 0.2721 
+2024-05-25 14:27:43,236 - INFO - [262144/436063] Loss : 0.2728 +2024-05-25 14:27:46,307 - INFO - Test loss : 0.3309 +2024-05-25 14:27:46,307 - INFO - +Epoch 49 +2024-05-25 14:28:19,132 - INFO - [204800/436063] Loss : 0.2719 +2024-05-25 14:28:27,335 - INFO - [262144/436063] Loss : 0.2726 +2024-05-25 14:28:30,586 - INFO - Test loss : 0.3131 +2024-05-25 14:28:30,587 - INFO - +Epoch 50 +2024-05-25 14:29:03,772 - INFO - [204800/436063] Loss : 0.2697 +2024-05-25 14:29:12,591 - INFO - [262144/436063] Loss : 0.2706 +2024-05-25 14:29:16,273 - INFO - Test loss : 0.3216 +2024-05-25 14:29:16,981 - INFO - (24343,) +2024-05-25 14:29:37,369 - INFO - Split ID: 0 +2024-05-25 14:29:37,378 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 3.87 +2024-05-25 14:29:37,379 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 9.29 +2024-05-25 14:29:37,380 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 13.69 +2024-05-25 14:29:37,381 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 21.77 +2024-05-25 14:29:37,481 - INFO - +No prior +2024-05-25 14:29:37,690 - INFO - (24426,) +2024-05-25 14:29:47,242 - INFO - Split ID: 0 +2024-05-25 14:29:47,242 - INFO - Top 1 (Epoch 50)acc (%): 60.2 +2024-05-25 14:29:47,242 - INFO - Top 3 (Epoch 50)acc (%): 77.9 +2024-05-25 14:29:47,242 - INFO - Top 5 (Epoch 50)acc (%): 83.29 +2024-05-25 14:29:47,242 - INFO - Top 10 (Epoch 50)acc (%): 88.5 +2024-05-25 14:30:10,876 - INFO - Split ID: 0 +2024-05-25 14:30:10,877 - INFO - Top 1 (Epoch 50)acc (%): 73.29 +2024-05-25 14:30:10,877 - INFO - Top 3 (Epoch 50)acc (%): 87.61 +2024-05-25 14:30:10,877 - INFO - Top 5 (Epoch 50)acc (%): 90.94 +2024-05-25 14:30:10,877 - INFO - Top 10 (Epoch 50)acc (%): 93.95 +2024-05-25 14:30:10,877 - INFO - +Epoch 51 +2024-05-25 14:30:44,417 - INFO - [204800/436063] Loss : 0.2695 +2024-05-25 14:30:53,279 - INFO - [262144/436063] Loss : 0.2699 +2024-05-25 14:30:56,886 - INFO - Test loss : 0.3333 +2024-05-25 14:30:56,886 - INFO - +Epoch 52 +2024-05-25 14:31:30,398 - INFO - [204800/436063] Loss : 0.2682 +2024-05-25 14:31:38,986 - INFO - [262144/436063] Loss : 0.2690 +2024-05-25 14:31:42,576 - INFO - Test loss : 0.3269 +2024-05-25 14:31:42,578 - INFO - +Epoch 53 +2024-05-25 14:32:15,478 - INFO - [204800/436063] Loss : 0.2679 +2024-05-25 14:32:23,847 - INFO - [262144/436063] Loss : 0.2683 +2024-05-25 14:32:27,314 - INFO - Test loss : 0.3471 +2024-05-25 14:32:27,314 - INFO - +Epoch 54 +2024-05-25 14:33:00,614 - INFO - [204800/436063] Loss : 0.2663 +2024-05-25 14:33:09,789 - INFO - [262144/436063] Loss : 0.2674 +2024-05-25 14:33:13,276 - INFO - Test loss : 0.3280 +2024-05-25 14:33:13,276 - INFO - +Epoch 55 +2024-05-25 14:33:46,228 - INFO - [204800/436063] Loss : 0.2651 +2024-05-25 14:33:54,999 - INFO - [262144/436063] Loss : 0.2653 +2024-05-25 14:33:58,454 - INFO - Test loss : 0.3448 +2024-05-25 14:33:58,454 - INFO - +Epoch 56 +2024-05-25 14:34:31,201 - INFO - [204800/436063] Loss : 0.2650 +2024-05-25 14:34:39,646 - INFO - [262144/436063] Loss : 0.2654 +2024-05-25 14:34:43,346 - INFO - Test loss : 0.3485 +2024-05-25 14:34:43,346 - INFO - +Epoch 57 +2024-05-25 14:35:15,673 - INFO - [204800/436063] Loss : 0.2632 +2024-05-25 14:35:23,929 - INFO - [262144/436063] Loss : 0.2640 +2024-05-25 14:35:27,090 - INFO - Test loss : 0.3628 +2024-05-25 14:35:27,090 - INFO - +Epoch 58 +2024-05-25 14:36:01,364 - INFO - [204800/436063] Loss : 0.2625 +2024-05-25 14:36:10,465 - INFO - [262144/436063] Loss : 0.2636 +2024-05-25 14:36:14,000 - INFO - Test loss : 0.3654 +2024-05-25 14:36:14,000 - INFO - +Epoch 59 +2024-05-25 14:36:49,126 - INFO - [204800/436063] Loss : 0.2635 
+2024-05-25 14:36:57,904 - INFO - [262144/436063] Loss : 0.2637 +2024-05-25 14:37:01,231 - INFO - Test loss : 0.3590 +2024-05-25 14:37:01,231 - INFO - +Epoch 60 +2024-05-25 14:37:32,911 - INFO - [204800/436063] Loss : 0.2614 +2024-05-25 14:37:41,178 - INFO - [262144/436063] Loss : 0.2615 +2024-05-25 14:37:44,352 - INFO - Test loss : 0.3781 +2024-05-25 14:37:45,070 - INFO - (24343,) +2024-05-25 14:38:05,460 - INFO - Split ID: 0 +2024-05-25 14:38:05,469 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 4.54 +2024-05-25 14:38:05,470 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 10.28 +2024-05-25 14:38:05,471 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 14.64 +2024-05-25 14:38:05,472 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 23.11 +2024-05-25 14:38:05,577 - INFO - +No prior +2024-05-25 14:38:05,799 - INFO - (24426,) +2024-05-25 14:38:15,360 - INFO - Split ID: 0 +2024-05-25 14:38:15,360 - INFO - Top 1 (Epoch 60)acc (%): 60.2 +2024-05-25 14:38:15,360 - INFO - Top 3 (Epoch 60)acc (%): 77.9 +2024-05-25 14:38:15,360 - INFO - Top 5 (Epoch 60)acc (%): 83.29 +2024-05-25 14:38:15,360 - INFO - Top 10 (Epoch 60)acc (%): 88.5 +2024-05-25 14:38:39,115 - INFO - Split ID: 0 +2024-05-25 14:38:39,116 - INFO - Top 1 (Epoch 60)acc (%): 73.57 +2024-05-25 14:38:39,116 - INFO - Top 3 (Epoch 60)acc (%): 87.65 +2024-05-25 14:38:39,116 - INFO - Top 5 (Epoch 60)acc (%): 90.87 +2024-05-25 14:38:39,116 - INFO - Top 10 (Epoch 60)acc (%): 93.85 +2024-05-25 14:38:39,116 - INFO - +Epoch 61 +2024-05-25 14:39:11,950 - INFO - [204800/436063] Loss : 0.2602 +2024-05-25 14:39:20,269 - INFO - [262144/436063] Loss : 0.2606 +2024-05-25 14:39:23,383 - INFO - Test loss : 0.3661 +2024-05-25 14:39:23,384 - INFO - +Epoch 62 +2024-05-25 14:39:57,023 - INFO - [204800/436063] Loss : 0.2600 +2024-05-25 14:40:06,086 - INFO - [262144/436063] Loss : 0.2606 +2024-05-25 14:40:09,653 - INFO - Test loss : 0.3538 +2024-05-25 14:40:09,654 - INFO - +Epoch 63 +2024-05-25 14:40:44,325 - INFO - [204800/436063] Loss : 0.2587 +2024-05-25 14:40:53,226 - INFO - [262144/436063] Loss : 0.2596 +2024-05-25 14:40:56,827 - INFO - Test loss : 0.3623 +2024-05-25 14:40:56,827 - INFO - +Epoch 64 +2024-05-25 14:41:32,169 - INFO - [204800/436063] Loss : 0.2585 +2024-05-25 14:41:40,671 - INFO - [262144/436063] Loss : 0.2592 +2024-05-25 14:41:44,199 - INFO - Test loss : 0.3705 +2024-05-25 14:41:44,199 - INFO - +Epoch 65 +2024-05-25 14:42:17,164 - INFO - [204800/436063] Loss : 0.2576 +2024-05-25 14:42:25,363 - INFO - [262144/436063] Loss : 0.2581 +2024-05-25 14:42:28,495 - INFO - Test loss : 0.3615 +2024-05-25 14:42:28,495 - INFO - +Epoch 66 +2024-05-25 14:43:00,600 - INFO - [204800/436063] Loss : 0.2565 +2024-05-25 14:43:09,127 - INFO - [262144/436063] Loss : 0.2567 +2024-05-25 14:43:12,665 - INFO - Test loss : 0.3717 +2024-05-25 14:43:12,665 - INFO - +Epoch 67 +2024-05-25 14:43:46,965 - INFO - [204800/436063] Loss : 0.2560 +2024-05-25 14:43:55,309 - INFO - [262144/436063] Loss : 0.2566 +2024-05-25 14:43:58,641 - INFO - Test loss : 0.3782 +2024-05-25 14:43:58,642 - INFO - +Epoch 68 +2024-05-25 14:44:30,515 - INFO - [204800/436063] Loss : 0.2551 +2024-05-25 14:44:39,731 - INFO - [262144/436063] Loss : 0.2556 +2024-05-25 14:44:43,040 - INFO - Test loss : 0.3730 +2024-05-25 14:44:43,040 - INFO - +Epoch 69 +2024-05-25 14:45:15,553 - INFO - [204800/436063] Loss : 0.2537 +2024-05-25 14:45:23,526 - INFO - [262144/436063] Loss : 0.2544 +2024-05-25 14:45:26,662 - INFO - Test loss : 0.3568 +2024-05-25 14:45:26,662 - INFO - +Epoch 70 +2024-05-25 14:45:59,063 - INFO - [204800/436063] Loss : 0.2542 
+2024-05-25 14:46:07,468 - INFO - [262144/436063] Loss : 0.2547 +2024-05-25 14:46:10,840 - INFO - Test loss : 0.3791 +2024-05-25 14:46:11,541 - INFO - (24343,) +2024-05-25 14:46:31,917 - INFO - Split ID: 0 +2024-05-25 14:46:31,926 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 4.94 +2024-05-25 14:46:31,927 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 11.06 +2024-05-25 14:46:31,928 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 15.59 +2024-05-25 14:46:31,929 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 24.01 +2024-05-25 14:46:32,036 - INFO - +No prior +2024-05-25 14:46:32,259 - INFO - (24426,) +2024-05-25 14:46:41,839 - INFO - Split ID: 0 +2024-05-25 14:46:41,840 - INFO - Top 1 (Epoch 70)acc (%): 60.2 +2024-05-25 14:46:41,840 - INFO - Top 3 (Epoch 70)acc (%): 77.9 +2024-05-25 14:46:41,840 - INFO - Top 5 (Epoch 70)acc (%): 83.29 +2024-05-25 14:46:41,840 - INFO - Top 10 (Epoch 70)acc (%): 88.5 +2024-05-25 14:47:05,280 - INFO - Split ID: 0 +2024-05-25 14:47:05,280 - INFO - Top 1 (Epoch 70)acc (%): 73.53 +2024-05-25 14:47:05,281 - INFO - Top 3 (Epoch 70)acc (%): 87.58 +2024-05-25 14:47:05,281 - INFO - Top 5 (Epoch 70)acc (%): 90.83 +2024-05-25 14:47:05,281 - INFO - Top 10 (Epoch 70)acc (%): 93.87 +2024-05-25 14:47:05,281 - INFO - +Epoch 71 +2024-05-25 14:47:39,395 - INFO - [204800/436063] Loss : 0.2537 +2024-05-25 14:47:47,625 - INFO - [262144/436063] Loss : 0.2540 +2024-05-25 14:47:50,961 - INFO - Test loss : 0.3797 +2024-05-25 14:47:50,961 - INFO - +Epoch 72 +2024-05-25 14:48:24,263 - INFO - [204800/436063] Loss : 0.2525 +2024-05-25 14:48:32,658 - INFO - [262144/436063] Loss : 0.2536 +2024-05-25 14:48:36,058 - INFO - Test loss : 0.3802 +2024-05-25 14:48:36,058 - INFO - +Epoch 73 +2024-05-25 14:49:08,132 - INFO - [204800/436063] Loss : 0.2518 +2024-05-25 14:49:16,423 - INFO - [262144/436063] Loss : 0.2524 +2024-05-25 14:49:19,761 - INFO - Test loss : 0.3866 +2024-05-25 14:49:19,761 - INFO - +Epoch 74 +2024-05-25 14:49:53,610 - INFO - [204800/436063] Loss : 0.2513 +2024-05-25 14:50:02,227 - INFO - [262144/436063] Loss : 0.2519 +2024-05-25 14:50:05,552 - INFO - Test loss : 0.3902 +2024-05-25 14:50:05,552 - INFO - +Epoch 75 +2024-05-25 14:50:37,490 - INFO - [204800/436063] Loss : 0.2514 +2024-05-25 14:50:46,177 - INFO - [262144/436063] Loss : 0.2515 +2024-05-25 14:50:49,664 - INFO - Test loss : 0.3887 +2024-05-25 14:50:49,664 - INFO - +Epoch 76 +2024-05-25 14:51:22,500 - INFO - [204800/436063] Loss : 0.2503 +2024-05-25 14:51:30,719 - INFO - [262144/436063] Loss : 0.2507 +2024-05-25 14:51:34,039 - INFO - Test loss : 0.3757 +2024-05-25 14:51:34,039 - INFO - +Epoch 77 +2024-05-25 14:52:05,547 - INFO - [204800/436063] Loss : 0.2498 +2024-05-25 14:52:13,513 - INFO - [262144/436063] Loss : 0.2503 +2024-05-25 14:52:16,957 - INFO - Test loss : 0.3947 +2024-05-25 14:52:16,958 - INFO - +Epoch 78 +2024-05-25 14:52:51,375 - INFO - [204800/436063] Loss : 0.2484 +2024-05-25 14:53:00,916 - INFO - [262144/436063] Loss : 0.2486 +2024-05-25 14:53:04,521 - INFO - Test loss : 0.3967 +2024-05-25 14:53:04,521 - INFO - +Epoch 79 +2024-05-25 14:53:37,546 - INFO - [204800/436063] Loss : 0.2481 +2024-05-25 14:53:45,793 - INFO - [262144/436063] Loss : 0.2485 +2024-05-25 14:53:49,088 - INFO - Test loss : 0.4166 +2024-05-25 14:53:49,088 - INFO - +Epoch 80 +2024-05-25 14:54:22,087 - INFO - [204800/436063] Loss : 0.2473 +2024-05-25 14:54:30,978 - INFO - [262144/436063] Loss : 0.2480 +2024-05-25 14:54:34,660 - INFO - Test loss : 0.3973 +2024-05-25 14:54:35,381 - INFO - (24343,) +2024-05-25 14:54:55,882 - INFO - Split ID: 0 +2024-05-25 
14:54:55,891 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 4.9 +2024-05-25 14:54:55,892 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 11.22 +2024-05-25 14:54:55,893 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 16.02 +2024-05-25 14:54:55,894 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 24.33 +2024-05-25 14:54:55,993 - INFO - +No prior +2024-05-25 14:54:56,202 - INFO - (24426,) +2024-05-25 14:55:05,841 - INFO - Split ID: 0 +2024-05-25 14:55:05,841 - INFO - Top 1 (Epoch 80)acc (%): 60.2 +2024-05-25 14:55:05,842 - INFO - Top 3 (Epoch 80)acc (%): 77.9 +2024-05-25 14:55:05,842 - INFO - Top 5 (Epoch 80)acc (%): 83.29 +2024-05-25 14:55:05,842 - INFO - Top 10 (Epoch 80)acc (%): 88.5 +2024-05-25 14:55:29,604 - INFO - Split ID: 0 +2024-05-25 14:55:29,604 - INFO - Top 1 (Epoch 80)acc (%): 73.47 +2024-05-25 14:55:29,604 - INFO - Top 3 (Epoch 80)acc (%): 87.58 +2024-05-25 14:55:29,604 - INFO - Top 5 (Epoch 80)acc (%): 90.8 +2024-05-25 14:55:29,604 - INFO - Top 10 (Epoch 80)acc (%): 93.83 +2024-05-25 14:55:29,605 - INFO - +Epoch 81 +2024-05-25 14:56:03,149 - INFO - [204800/436063] Loss : 0.2479 +2024-05-25 14:56:11,248 - INFO - [262144/436063] Loss : 0.2483 +2024-05-25 14:56:14,611 - INFO - Test loss : 0.3979 +2024-05-25 14:56:14,611 - INFO - +Epoch 82 +2024-05-25 14:56:46,207 - INFO - [204800/436063] Loss : 0.2461 +2024-05-25 14:56:54,373 - INFO - [262144/436063] Loss : 0.2468 +2024-05-25 14:56:57,651 - INFO - Test loss : 0.4113 +2024-05-25 14:56:57,652 - INFO - +Epoch 83 +2024-05-25 14:57:32,463 - INFO - [204800/436063] Loss : 0.2461 +2024-05-25 14:57:41,224 - INFO - [262144/436063] Loss : 0.2464 +2024-05-25 14:57:44,468 - INFO - Test loss : 0.3978 +2024-05-25 14:57:44,469 - INFO - +Epoch 84 +2024-05-25 14:58:18,307 - INFO - [204800/436063] Loss : 0.2456 +2024-05-25 14:58:27,110 - INFO - [262144/436063] Loss : 0.2458 +2024-05-25 14:58:30,732 - INFO - Test loss : 0.4110 +2024-05-25 14:58:30,733 - INFO - +Epoch 85 +2024-05-25 14:59:03,476 - INFO - [204800/436063] Loss : 0.2446 +2024-05-25 14:59:11,696 - INFO - [262144/436063] Loss : 0.2448 +2024-05-25 14:59:15,165 - INFO - Test loss : 0.4138 +2024-05-25 14:59:15,165 - INFO - +Epoch 86 +2024-05-25 14:59:49,290 - INFO - [204800/436063] Loss : 0.2441 +2024-05-25 14:59:58,205 - INFO - [262144/436063] Loss : 0.2448 +2024-05-25 15:00:01,933 - INFO - Test loss : 0.4054 +2024-05-25 15:00:01,933 - INFO - +Epoch 87 +2024-05-25 15:00:36,625 - INFO - [204800/436063] Loss : 0.2443 +2024-05-25 15:00:44,894 - INFO - [262144/436063] Loss : 0.2444 +2024-05-25 15:00:48,373 - INFO - Test loss : 0.4172 +2024-05-25 15:00:48,374 - INFO - +Epoch 88 +2024-05-25 15:01:20,600 - INFO - [204800/436063] Loss : 0.2440 +2024-05-25 15:01:28,761 - INFO - [262144/436063] Loss : 0.2443 +2024-05-25 15:01:31,994 - INFO - Test loss : 0.4135 +2024-05-25 15:01:31,994 - INFO - +Epoch 89 +2024-05-25 15:02:05,725 - INFO - [204800/436063] Loss : 0.2433 +2024-05-25 15:02:13,800 - INFO - [262144/436063] Loss : 0.2435 +2024-05-25 15:02:16,996 - INFO - Test loss : 0.4212 +2024-05-25 15:02:16,996 - INFO - +Epoch 90 +2024-05-25 15:02:52,270 - INFO - [204800/436063] Loss : 0.2422 +2024-05-25 15:03:01,121 - INFO - [262144/436063] Loss : 0.2424 +2024-05-25 15:03:04,225 - INFO - Test loss : 0.4274 +2024-05-25 15:03:04,927 - INFO - (24343,) +2024-05-25 15:03:25,333 - INFO - Split ID: 0 +2024-05-25 15:03:25,342 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 5.02 +2024-05-25 15:03:25,343 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 11.41 +2024-05-25 15:03:25,344 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 16.19 +2024-05-25 15:03:25,345 - 
INFO - Top 10 LocEnc (Epoch 90)acc (%): 24.89 +2024-05-25 15:03:25,446 - INFO - +No prior +2024-05-25 15:03:25,660 - INFO - (24426,) +2024-05-25 15:03:35,211 - INFO - Split ID: 0 +2024-05-25 15:03:35,211 - INFO - Top 1 (Epoch 90)acc (%): 60.2 +2024-05-25 15:03:35,211 - INFO - Top 3 (Epoch 90)acc (%): 77.9 +2024-05-25 15:03:35,211 - INFO - Top 5 (Epoch 90)acc (%): 83.29 +2024-05-25 15:03:35,211 - INFO - Top 10 (Epoch 90)acc (%): 88.5 +2024-05-25 15:03:58,789 - INFO - Split ID: 0 +2024-05-25 15:03:58,790 - INFO - Top 1 (Epoch 90)acc (%): 73.45 +2024-05-25 15:03:58,790 - INFO - Top 3 (Epoch 90)acc (%): 87.42 +2024-05-25 15:03:58,790 - INFO - Top 5 (Epoch 90)acc (%): 90.63 +2024-05-25 15:03:58,790 - INFO - Top 10 (Epoch 90)acc (%): 93.67 +2024-05-25 15:03:58,790 - INFO - +Epoch 91 +2024-05-25 15:04:31,688 - INFO - [204800/436063] Loss : 0.2420 +2024-05-25 15:04:40,293 - INFO - [262144/436063] Loss : 0.2420 +2024-05-25 15:04:43,773 - INFO - Test loss : 0.4089 +2024-05-25 15:04:43,773 - INFO - +Epoch 92 +2024-05-25 15:05:16,475 - INFO - [204800/436063] Loss : 0.2419 +2024-05-25 15:05:24,762 - INFO - [262144/436063] Loss : 0.2423 +2024-05-25 15:05:28,094 - INFO - Test loss : 0.4181 +2024-05-25 15:05:28,095 - INFO - +Epoch 93 +2024-05-25 15:06:01,407 - INFO - [204800/436063] Loss : 0.2415 +2024-05-25 15:06:09,505 - INFO - [262144/436063] Loss : 0.2417 +2024-05-25 15:06:12,709 - INFO - Test loss : 0.4223 +2024-05-25 15:06:12,710 - INFO - +Epoch 94 +2024-05-25 15:06:46,420 - INFO - [204800/436063] Loss : 0.2402 +2024-05-25 15:06:54,867 - INFO - [262144/436063] Loss : 0.2408 +2024-05-25 15:06:58,132 - INFO - Test loss : 0.4275 +2024-05-25 15:06:58,132 - INFO - +Epoch 95 +2024-05-25 15:07:30,039 - INFO - [204800/436063] Loss : 0.2404 +2024-05-25 15:07:38,033 - INFO - [262144/436063] Loss : 0.2408 +2024-05-25 15:07:41,278 - INFO - Test loss : 0.4326 +2024-05-25 15:07:41,278 - INFO - +Epoch 96 +2024-05-25 15:08:13,601 - INFO - [204800/436063] Loss : 0.2392 +2024-05-25 15:08:22,422 - INFO - [262144/436063] Loss : 0.2396 +2024-05-25 15:08:26,021 - INFO - Test loss : 0.4290 +2024-05-25 15:08:26,021 - INFO - +Epoch 97 +2024-05-25 15:09:01,012 - INFO - [204800/436063] Loss : 0.2394 +2024-05-25 15:09:09,826 - INFO - [262144/436063] Loss : 0.2396 +2024-05-25 15:09:13,338 - INFO - Test loss : 0.4290 +2024-05-25 15:09:13,338 - INFO - +Epoch 98 +2024-05-25 15:09:45,685 - INFO - [204800/436063] Loss : 0.2397 +2024-05-25 15:09:53,791 - INFO - [262144/436063] Loss : 0.2398 +2024-05-25 15:09:57,253 - INFO - Test loss : 0.4344 +2024-05-25 15:09:57,253 - INFO - +Epoch 99 +2024-05-25 15:10:30,197 - INFO - [204800/436063] Loss : 0.2389 +2024-05-25 15:10:40,564 - INFO - [262144/436063] Loss : 0.2394 +2024-05-25 15:10:44,879 - INFO - Test loss : 0.4375 +2024-05-25 15:10:44,879 - INFO - Saving output model to ../models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-25 15:10:44,943 - INFO - Saving output model to ../models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-25 15:10:46,959 - INFO - +No prior +2024-05-25 15:10:53,403 - INFO - (24426,) +2024-05-25 15:11:16,449 - INFO - Split ID: 0 +2024-05-25 15:11:16,450 - INFO - Top 1 acc (%): 60.2 +2024-05-25 15:11:16,450 - INFO - Top 3 acc (%): 77.9 +2024-05-25 15:11:16,450 - INFO - Top 5 acc (%): 83.29 +2024-05-25 15:11:16,450 - INFO - Top 10 acc (%): 88.5 +2024-05-25 15:11:40,945 - INFO - Split ID: 0 +2024-05-25 15:11:40,945 - 
INFO - Top 1 acc (%): 73.52 +2024-05-25 15:11:40,945 - INFO - Top 3 acc (%): 87.37 +2024-05-25 15:11:40,945 - INFO - Top 5 acc (%): 90.62 +2024-05-25 15:11:40,945 - INFO - Top 10 acc (%): 93.64 +2024-05-25 15:11:40,946 - INFO - +Space2Vec-theory +2024-05-25 15:11:40,946 - INFO - Model : model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-25 15:11:41,717 - INFO - (24343,) +2024-05-25 15:12:02,123 - INFO - Split ID: 0 +2024-05-25 15:12:02,132 - INFO - Top 1 LocEnc acc (%): 5.4 +2024-05-25 15:12:02,133 - INFO - Top 3 LocEnc acc (%): 11.91 +2024-05-25 15:12:02,134 - INFO - Top 5 LocEnc acc (%): 16.67 +2024-05-25 15:12:02,135 - INFO - Top 10 LocEnc acc (%): 25.42 +2024-05-31 01:53:00,293 - INFO - +num_classes 8142 +2024-05-31 01:53:00,293 - INFO - num train 436063 +2024-05-31 01:53:00,293 - INFO - num val 24343 +2024-05-31 01:53:00,293 - INFO - train loss full_loss +2024-05-31 01:53:00,293 - INFO - model name ../models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:53:00,293 - INFO - num users 18643 +2024-05-31 01:53:01,186 - INFO - +Only Space2Vec-theory +2024-05-31 01:53:01,187 - INFO - Model : model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:53:01,501 - INFO - Saving output model to ../models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:53:01,712 - INFO - Saving output model to ../models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:53:02,046 - INFO - +No prior +2024-05-31 01:53:02,277 - INFO - (24426,) +2024-05-31 01:53:19,462 - INFO - Save results to ../eval_results/eval_inat_2018__val_no_prior.csv +2024-05-31 01:53:19,462 - INFO - Split ID: 0 +2024-05-31 01:53:19,462 - INFO - Top 1 acc (%): 60.2 +2024-05-31 01:53:19,462 - INFO - Top 3 acc (%): 77.9 +2024-05-31 01:53:19,462 - INFO - Top 5 acc (%): 83.29 +2024-05-31 01:53:19,463 - INFO - Top 10 acc (%): 88.5 +2024-05-31 01:53:45,419 - INFO - Split ID: 0 +2024-05-31 01:53:45,419 - INFO - Top 1 hit (%): 73.52 +2024-05-31 01:53:45,419 - INFO - Top 3 hit (%): 87.37 +2024-05-31 01:53:45,419 - INFO - Top 5 hit (%): 90.62 +2024-05-31 01:53:45,419 - INFO - Top 10 hit (%): 93.64 +2024-05-31 01:53:45,426 - INFO - +Only Space2Vec-theory +2024-05-31 01:53:45,426 - INFO - Model : model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:53:46,187 - INFO - (24343,) +2024-05-31 01:54:06,271 - INFO - Split ID: 0 +2024-05-31 01:54:06,279 - INFO - Top 1 LocEnc acc (%): 5.4 +2024-05-31 01:54:06,280 - INFO - Top 3 LocEnc acc (%): 11.91 +2024-05-31 01:54:06,281 - INFO - Top 5 LocEnc acc (%): 16.67 +2024-05-31 01:54:06,282 - INFO - Top 10 LocEnc acc (%): 25.42 diff --git a/pre_trained_models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..e0d934f6 Binary files /dev/null and b/pre_trained_models/space2vec_theory/model_inat_2018_Space2Vec-theory_0.0200_64_0.0500000_360.000_1_512_BATCH4096_leakyrelu.pth.tar differ diff --git 
a/pre_trained_models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..6abb3481 --- /dev/null +++ b/pre_trained_models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log @@ -0,0 +1,226 @@ +2024-05-23 17:15:55,932 - INFO - +num_classes 555 +2024-05-23 17:15:55,932 - INFO - num train 22599 +2024-05-23 17:15:55,932 - INFO - num val 1100 +2024-05-23 17:15:55,932 - INFO - train loss full_loss +2024-05-23 17:15:55,932 - INFO - model name ../models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 17:15:55,932 - INFO - num users 5331 +2024-05-23 17:15:55,933 - INFO - meta data ebird_meta +2024-05-23 17:15:56,708 - INFO - +Epoch 0 +2024-05-23 17:16:17,113 - INFO - [20480/22599] Loss : 1.6591 +2024-05-23 17:16:18,129 - INFO - Test loss : 1.0968 +2024-05-23 17:16:18,129 - INFO - +Epoch 1 +2024-05-23 17:16:38,507 - INFO - [20480/22599] Loss : 1.3185 +2024-05-23 17:16:39,528 - INFO - Test loss : 0.3637 +2024-05-23 17:16:39,529 - INFO - +Epoch 2 +2024-05-23 17:16:59,951 - INFO - [20480/22599] Loss : 1.1092 +2024-05-23 17:17:00,953 - INFO - Test loss : 0.5217 +2024-05-23 17:17:00,953 - INFO - +Epoch 3 +2024-05-23 17:17:21,297 - INFO - [20480/22599] Loss : 1.0390 +2024-05-23 17:17:22,103 - INFO - Test loss : 0.4443 +2024-05-23 17:17:22,104 - INFO - +Epoch 4 +2024-05-23 17:17:42,393 - INFO - [20480/22599] Loss : 0.9929 +2024-05-23 17:17:43,369 - INFO - Test loss : 0.3695 +2024-05-23 17:17:43,369 - INFO - +Epoch 5 +2024-05-23 17:18:04,065 - INFO - [20480/22599] Loss : 0.9580 +2024-05-23 17:18:05,086 - INFO - Test loss : 0.4628 +2024-05-23 17:18:05,086 - INFO - +Epoch 6 +2024-05-23 17:18:25,364 - INFO - [20480/22599] Loss : 0.9334 +2024-05-23 17:18:26,379 - INFO - Test loss : 0.4225 +2024-05-23 17:18:26,379 - INFO - +Epoch 7 +2024-05-23 17:18:46,610 - INFO - [20480/22599] Loss : 0.9118 +2024-05-23 17:18:47,626 - INFO - Test loss : 0.4542 +2024-05-23 17:18:47,627 - INFO - +Epoch 8 +2024-05-23 17:19:07,850 - INFO - [20480/22599] Loss : 0.8945 +2024-05-23 17:19:08,866 - INFO - Test loss : 0.4696 +2024-05-23 17:19:08,866 - INFO - +Epoch 9 +2024-05-23 17:19:29,236 - INFO - [20480/22599] Loss : 0.8801 +2024-05-23 17:19:29,790 - INFO - Test loss : 0.4740 +2024-05-23 17:19:29,790 - INFO - +Epoch 10 +2024-05-23 17:19:50,135 - INFO - [20480/22599] Loss : 0.8690 +2024-05-23 17:19:51,079 - INFO - Test loss : 0.4967 +2024-05-23 17:19:51,091 - INFO - (1100,) +2024-05-23 17:19:51,136 - INFO - Split ID: 0 +2024-05-23 17:19:51,137 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 2.64 +2024-05-23 17:19:51,137 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 6.82 +2024-05-23 17:19:51,137 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 10.64 +2024-05-23 17:19:51,137 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 17.09 +2024-05-23 17:19:51,138 - INFO - +No prior +2024-05-23 17:19:51,165 - INFO - (24633,) +2024-05-23 17:19:51,835 - INFO - Split ID: 0 +2024-05-23 17:19:51,836 - INFO - Top 1 (Epoch 10)acc (%): 76.08 +2024-05-23 17:19:51,836 - INFO - Top 3 (Epoch 10)acc (%): 90.98 +2024-05-23 17:19:51,836 - INFO - Top 5 (Epoch 10)acc (%): 94.06 +2024-05-23 17:19:51,836 - INFO - Top 10 (Epoch 10)acc (%): 96.83 +2024-05-23 
17:20:21,446 - INFO - Split ID: 0 +2024-05-23 17:20:21,447 - INFO - Top 1 (Epoch 10)acc (%): 81.46 +2024-05-23 17:20:21,447 - INFO - Top 3 (Epoch 10)acc (%): 93.53 +2024-05-23 17:20:21,447 - INFO - Top 5 (Epoch 10)acc (%): 95.86 +2024-05-23 17:20:21,447 - INFO - Top 10 (Epoch 10)acc (%): 97.9 +2024-05-23 17:20:21,450 - INFO - +Epoch 11 +2024-05-23 17:20:41,533 - INFO - [20480/22599] Loss : 0.8539 +2024-05-23 17:20:42,356 - INFO - Test loss : 0.5089 +2024-05-23 17:20:42,356 - INFO - +Epoch 12 +2024-05-23 17:21:02,893 - INFO - [20480/22599] Loss : 0.8407 +2024-05-23 17:21:03,889 - INFO - Test loss : 0.5385 +2024-05-23 17:21:03,889 - INFO - +Epoch 13 +2024-05-23 17:21:24,624 - INFO - [20480/22599] Loss : 0.8333 +2024-05-23 17:21:25,587 - INFO - Test loss : 0.5290 +2024-05-23 17:21:25,587 - INFO - +Epoch 14 +2024-05-23 17:21:46,058 - INFO - [20480/22599] Loss : 0.8221 +2024-05-23 17:21:47,074 - INFO - Test loss : 0.5707 +2024-05-23 17:21:47,074 - INFO - +Epoch 15 +2024-05-23 17:22:07,063 - INFO - [20480/22599] Loss : 0.8155 +2024-05-23 17:22:08,079 - INFO - Test loss : 0.5718 +2024-05-23 17:22:08,080 - INFO - +Epoch 16 +2024-05-23 17:22:28,384 - INFO - [20480/22599] Loss : 0.8056 +2024-05-23 17:22:29,402 - INFO - Test loss : 0.5945 +2024-05-23 17:22:29,402 - INFO - +Epoch 17 +2024-05-23 17:22:50,071 - INFO - [20480/22599] Loss : 0.7990 +2024-05-23 17:22:51,091 - INFO - Test loss : 0.5999 +2024-05-23 17:22:51,091 - INFO - +Epoch 18 +2024-05-23 17:23:11,167 - INFO - [20480/22599] Loss : 0.7918 +2024-05-23 17:23:12,187 - INFO - Test loss : 0.6236 +2024-05-23 17:23:12,187 - INFO - +Epoch 19 +2024-05-23 17:23:29,273 - INFO - [20480/22599] Loss : 0.7845 +2024-05-23 17:23:29,629 - INFO - Test loss : 0.6282 +2024-05-23 17:23:29,630 - INFO - +Epoch 20 +2024-05-23 17:23:49,787 - INFO - [20480/22599] Loss : 0.7820 +2024-05-23 17:23:50,789 - INFO - Test loss : 0.6291 +2024-05-23 17:23:50,798 - INFO - (1100,) +2024-05-23 17:23:50,843 - INFO - Split ID: 0 +2024-05-23 17:23:50,844 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 2.91 +2024-05-23 17:23:50,844 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 7.64 +2024-05-23 17:23:50,844 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 10.73 +2024-05-23 17:23:50,844 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 17.18 +2024-05-23 17:23:50,845 - INFO - +No prior +2024-05-23 17:23:50,868 - INFO - (24633,) +2024-05-23 17:23:51,560 - INFO - Split ID: 0 +2024-05-23 17:23:51,560 - INFO - Top 1 (Epoch 20)acc (%): 76.08 +2024-05-23 17:23:51,560 - INFO - Top 3 (Epoch 20)acc (%): 90.98 +2024-05-23 17:23:51,560 - INFO - Top 5 (Epoch 20)acc (%): 94.06 +2024-05-23 17:23:51,560 - INFO - Top 10 (Epoch 20)acc (%): 96.83 +2024-05-23 17:24:21,077 - INFO - Split ID: 0 +2024-05-23 17:24:21,077 - INFO - Top 1 (Epoch 20)acc (%): 81.81 +2024-05-23 17:24:21,078 - INFO - Top 3 (Epoch 20)acc (%): 93.52 +2024-05-23 17:24:21,078 - INFO - Top 5 (Epoch 20)acc (%): 95.86 +2024-05-23 17:24:21,078 - INFO - Top 10 (Epoch 20)acc (%): 97.8 +2024-05-23 17:24:21,078 - INFO - +Epoch 21 +2024-05-23 17:24:41,469 - INFO - [20480/22599] Loss : 0.7761 +2024-05-23 17:24:42,474 - INFO - Test loss : 0.6548 +2024-05-23 17:24:42,474 - INFO - +Epoch 22 +2024-05-23 17:25:02,886 - INFO - [20480/22599] Loss : 0.7739 +2024-05-23 17:25:03,896 - INFO - Test loss : 0.6586 +2024-05-23 17:25:03,896 - INFO - +Epoch 23 +2024-05-23 17:25:24,281 - INFO - [20480/22599] Loss : 0.7664 +2024-05-23 17:25:25,299 - INFO - Test loss : 0.6870 +2024-05-23 17:25:25,299 - INFO - +Epoch 24 +2024-05-23 17:25:45,933 - INFO - [20480/22599] Loss : 0.7584 +2024-05-23 
17:25:46,951 - INFO - Test loss : 0.6965 +2024-05-23 17:25:46,951 - INFO - +Epoch 25 +2024-05-23 17:26:07,317 - INFO - [20480/22599] Loss : 0.7547 +2024-05-23 17:26:08,321 - INFO - Test loss : 0.7121 +2024-05-23 17:26:08,321 - INFO - +Epoch 26 +2024-05-23 17:26:28,725 - INFO - [20480/22599] Loss : 0.7521 +2024-05-23 17:26:29,741 - INFO - Test loss : 0.7117 +2024-05-23 17:26:29,742 - INFO - +Epoch 27 +2024-05-23 17:26:50,408 - INFO - [20480/22599] Loss : 0.7450 +2024-05-23 17:26:51,368 - INFO - Test loss : 0.7499 +2024-05-23 17:26:51,368 - INFO - +Epoch 28 +2024-05-23 17:27:11,664 - INFO - [20480/22599] Loss : 0.7411 +2024-05-23 17:27:12,627 - INFO - Test loss : 0.7460 +2024-05-23 17:27:12,627 - INFO - +Epoch 29 +2024-05-23 17:27:32,966 - INFO - [20480/22599] Loss : 0.7407 +2024-05-23 17:27:33,837 - INFO - Test loss : 0.7521 +2024-05-23 17:27:33,837 - INFO - Saving output model to ../models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 17:27:33,851 - INFO - Saving output model to ../models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 17:27:33,886 - INFO - +No prior +2024-05-23 17:27:33,909 - INFO - (24633,) +2024-05-23 17:27:34,586 - INFO - Split ID: 0 +2024-05-23 17:27:34,586 - INFO - Top 1 acc (%): 76.08 +2024-05-23 17:27:34,587 - INFO - Top 3 acc (%): 90.98 +2024-05-23 17:27:34,587 - INFO - Top 5 acc (%): 94.06 +2024-05-23 17:27:34,587 - INFO - Top 10 acc (%): 96.83 +2024-05-23 17:28:04,220 - INFO - Split ID: 0 +2024-05-23 17:28:04,220 - INFO - Top 1 acc (%): 81.65 +2024-05-23 17:28:04,220 - INFO - Top 3 acc (%): 93.4 +2024-05-23 17:28:04,220 - INFO - Top 5 acc (%): 95.71 +2024-05-23 17:28:04,221 - INFO - Top 10 acc (%): 97.71 +2024-05-23 17:28:04,221 - INFO - +Space2Vec-theory +2024-05-23 17:28:04,221 - INFO - Model : model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 17:28:04,237 - INFO - (1100,) +2024-05-23 17:28:04,283 - INFO - Split ID: 0 +2024-05-23 17:28:04,284 - INFO - Top 1 LocEnc acc (%): 2.82 +2024-05-23 17:28:04,284 - INFO - Top 3 LocEnc acc (%): 7.82 +2024-05-23 17:28:04,284 - INFO - Top 5 LocEnc acc (%): 11.55 +2024-05-23 17:28:04,284 - INFO - Top 10 LocEnc acc (%): 18.64 +2024-05-31 01:49:51,812 - INFO - +num_classes 555 +2024-05-31 01:49:51,812 - INFO - num train 22599 +2024-05-31 01:49:51,812 - INFO - num val 1100 +2024-05-31 01:49:51,812 - INFO - train loss full_loss +2024-05-31 01:49:51,812 - INFO - model name ../models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:49:51,812 - INFO - num users 5331 +2024-05-31 01:49:51,812 - INFO - meta data ebird_meta +2024-05-31 01:49:52,560 - INFO - +Only Space2Vec-theory +2024-05-31 01:49:52,560 - INFO - Model : model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:49:52,689 - INFO - Saving output model to ../models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:49:52,734 - INFO - Saving output model to ../models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:49:52,780 - INFO - 
+No prior +2024-05-31 01:49:52,812 - INFO - (24633,) +2024-05-31 01:49:54,367 - INFO - Save results to ../eval_results/eval_nabirds_ebird_meta_test_no_prior.csv +2024-05-31 01:49:54,368 - INFO - Split ID: 0 +2024-05-31 01:49:54,368 - INFO - Top 1 acc (%): 76.08 +2024-05-31 01:49:54,368 - INFO - Top 3 acc (%): 90.98 +2024-05-31 01:49:54,368 - INFO - Top 5 acc (%): 94.06 +2024-05-31 01:49:54,368 - INFO - Top 10 acc (%): 96.83 +2024-05-31 01:50:06,742 - INFO - Split ID: 0 +2024-05-31 01:50:06,742 - INFO - Top 1 hit (%): 81.65 +2024-05-31 01:50:06,742 - INFO - Top 3 hit (%): 93.4 +2024-05-31 01:50:06,743 - INFO - Top 5 hit (%): 95.71 +2024-05-31 01:50:06,743 - INFO - Top 10 hit (%): 97.71 +2024-05-31 01:50:06,749 - INFO - +Only Space2Vec-theory +2024-05-31 01:50:06,749 - INFO - Model : model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:50:06,788 - INFO - (1100,) +2024-05-31 01:50:06,832 - INFO - Split ID: 0 +2024-05-31 01:50:06,833 - INFO - Top 1 LocEnc acc (%): 2.82 +2024-05-31 01:50:06,833 - INFO - Top 3 LocEnc acc (%): 7.82 +2024-05-31 01:50:06,833 - INFO - Top 5 LocEnc acc (%): 11.55 +2024-05-31 01:50:06,833 - INFO - Top 10 LocEnc acc (%): 18.64 diff --git a/pre_trained_models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..454b2629 Binary files /dev/null and b/pre_trained_models/space2vec_theory/model_nabirds_ebird_meta_Space2Vec-theory_inception_v3_0.0100_32_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log b/pre_trained_models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..bfc5089a --- /dev/null +++ b/pre_trained_models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.log @@ -0,0 +1,224 @@ +2024-05-23 20:08:30,812 - INFO - +num_classes 100 +2024-05-23 20:08:30,812 - INFO - num train 66739 +2024-05-23 20:08:30,812 - INFO - num val 4449 +2024-05-23 20:08:30,812 - INFO - train loss full_loss +2024-05-23 20:08:30,812 - INFO - model name ../models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 20:08:30,812 - INFO - num users 1 +2024-05-23 20:08:31,572 - INFO - +Epoch 0 +2024-05-23 20:08:37,801 - INFO - [65536/66739] Loss : 1.3735 +2024-05-23 20:08:38,139 - INFO - Test loss : 0.4356 +2024-05-23 20:08:38,139 - INFO - +Epoch 1 +2024-05-23 20:08:43,289 - INFO - [65536/66739] Loss : 1.1711 +2024-05-23 20:08:43,549 - INFO - Test loss : 0.4522 +2024-05-23 20:08:43,549 - INFO - +Epoch 2 +2024-05-23 20:08:49,041 - INFO - [65536/66739] Loss : 1.1240 +2024-05-23 20:08:49,412 - INFO - Test loss : 0.4382 +2024-05-23 20:08:49,412 - INFO - +Epoch 3 +2024-05-23 20:08:54,689 - INFO - [65536/66739] Loss : 1.0996 +2024-05-23 20:08:55,056 - INFO - Test loss : 0.4427 +2024-05-23 20:08:55,056 - INFO - +Epoch 4 +2024-05-23 20:09:00,514 - INFO - [65536/66739] Loss : 1.0861 +2024-05-23 20:09:00,856 - INFO - Test loss : 0.4538 +2024-05-23 20:09:00,856 - INFO - 
+Epoch 5 +2024-05-23 20:09:06,331 - INFO - [65536/66739] Loss : 1.0735 +2024-05-23 20:09:06,599 - INFO - Test loss : 0.4564 +2024-05-23 20:09:06,599 - INFO - +Epoch 6 +2024-05-23 20:09:11,978 - INFO - [65536/66739] Loss : 1.0660 +2024-05-23 20:09:12,243 - INFO - Test loss : 0.4761 +2024-05-23 20:09:12,243 - INFO - +Epoch 7 +2024-05-23 20:09:17,689 - INFO - [65536/66739] Loss : 1.0606 +2024-05-23 20:09:17,949 - INFO - Test loss : 0.4655 +2024-05-23 20:09:17,950 - INFO - +Epoch 8 +2024-05-23 20:09:23,296 - INFO - [65536/66739] Loss : 1.0541 +2024-05-23 20:09:23,629 - INFO - Test loss : 0.4654 +2024-05-23 20:09:23,629 - INFO - +Epoch 9 +2024-05-23 20:09:28,924 - INFO - [65536/66739] Loss : 1.0496 +2024-05-23 20:09:29,405 - INFO - Test loss : 0.4817 +2024-05-23 20:09:29,406 - INFO - +Epoch 10 +2024-05-23 20:09:34,681 - INFO - [65536/66739] Loss : 1.0430 +2024-05-23 20:09:35,046 - INFO - Test loss : 0.4689 +2024-05-23 20:09:35,090 - INFO - (4449,) +2024-05-23 20:09:35,114 - INFO - Split ID: 0 +2024-05-23 20:09:35,116 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 10.79 +2024-05-23 20:09:35,116 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 21.87 +2024-05-23 20:09:35,116 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 30.1 +2024-05-23 20:09:35,116 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 45.49 +2024-05-23 20:09:35,117 - INFO - +No prior +2024-05-23 20:09:35,119 - INFO - (17798,) +2024-05-23 20:09:35,217 - INFO - Split ID: 0 +2024-05-23 20:09:35,217 - INFO - Top 1 (Epoch 10)acc (%): 50.15 +2024-05-23 20:09:35,217 - INFO - Top 3 (Epoch 10)acc (%): 73.9 +2024-05-23 20:09:35,217 - INFO - Top 5 (Epoch 10)acc (%): 82.45 +2024-05-23 20:09:35,217 - INFO - Top 10 (Epoch 10)acc (%): 91.06 +2024-05-23 20:09:43,961 - INFO - Split ID: 0 +2024-05-23 20:09:43,961 - INFO - Top 1 (Epoch 10)acc (%): 51.13 +2024-05-23 20:09:43,961 - INFO - Top 3 (Epoch 10)acc (%): 75.29 +2024-05-23 20:09:43,961 - INFO - Top 5 (Epoch 10)acc (%): 83.75 +2024-05-23 20:09:43,961 - INFO - Top 10 (Epoch 10)acc (%): 92.06 +2024-05-23 20:09:43,961 - INFO - +Epoch 11 +2024-05-23 20:09:49,341 - INFO - [65536/66739] Loss : 1.0398 +2024-05-23 20:09:49,600 - INFO - Test loss : 0.4705 +2024-05-23 20:09:49,600 - INFO - +Epoch 12 +2024-05-23 20:09:54,859 - INFO - [65536/66739] Loss : 1.0386 +2024-05-23 20:09:55,122 - INFO - Test loss : 0.5050 +2024-05-23 20:09:55,122 - INFO - +Epoch 13 +2024-05-23 20:10:00,762 - INFO - [65536/66739] Loss : 1.0334 +2024-05-23 20:10:01,021 - INFO - Test loss : 0.5060 +2024-05-23 20:10:01,021 - INFO - +Epoch 14 +2024-05-23 20:10:06,465 - INFO - [65536/66739] Loss : 1.0324 +2024-05-23 20:10:06,733 - INFO - Test loss : 0.4919 +2024-05-23 20:10:06,734 - INFO - +Epoch 15 +2024-05-23 20:10:12,050 - INFO - [65536/66739] Loss : 1.0275 +2024-05-23 20:10:12,523 - INFO - Test loss : 0.5110 +2024-05-23 20:10:12,523 - INFO - +Epoch 16 +2024-05-23 20:10:17,825 - INFO - [65536/66739] Loss : 1.0264 +2024-05-23 20:10:18,085 - INFO - Test loss : 0.5174 +2024-05-23 20:10:18,085 - INFO - +Epoch 17 +2024-05-23 20:10:23,786 - INFO - [65536/66739] Loss : 1.0231 +2024-05-23 20:10:24,150 - INFO - Test loss : 0.5043 +2024-05-23 20:10:24,150 - INFO - +Epoch 18 +2024-05-23 20:10:29,260 - INFO - [65536/66739] Loss : 1.0214 +2024-05-23 20:10:29,526 - INFO - Test loss : 0.5042 +2024-05-23 20:10:29,526 - INFO - +Epoch 19 +2024-05-23 20:10:35,122 - INFO - [65536/66739] Loss : 1.0217 +2024-05-23 20:10:35,384 - INFO - Test loss : 0.5255 +2024-05-23 20:10:35,384 - INFO - +Epoch 20 +2024-05-23 20:10:41,025 - INFO - [65536/66739] Loss : 1.0162 +2024-05-23 20:10:41,283 - 
INFO - Test loss : 0.5148 +2024-05-23 20:10:41,317 - INFO - (4449,) +2024-05-23 20:10:41,339 - INFO - Split ID: 0 +2024-05-23 20:10:41,341 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 11.46 +2024-05-23 20:10:41,341 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 22.93 +2024-05-23 20:10:41,341 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 31.02 +2024-05-23 20:10:41,342 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 47.34 +2024-05-23 20:10:41,342 - INFO - +No prior +2024-05-23 20:10:41,344 - INFO - (17798,) +2024-05-23 20:10:41,441 - INFO - Split ID: 0 +2024-05-23 20:10:41,441 - INFO - Top 1 (Epoch 20)acc (%): 50.15 +2024-05-23 20:10:41,441 - INFO - Top 3 (Epoch 20)acc (%): 73.9 +2024-05-23 20:10:41,441 - INFO - Top 5 (Epoch 20)acc (%): 82.45 +2024-05-23 20:10:41,441 - INFO - Top 10 (Epoch 20)acc (%): 91.06 +2024-05-23 20:10:50,276 - INFO - Split ID: 0 +2024-05-23 20:10:50,276 - INFO - Top 1 (Epoch 20)acc (%): 51.09 +2024-05-23 20:10:50,276 - INFO - Top 3 (Epoch 20)acc (%): 75.36 +2024-05-23 20:10:50,276 - INFO - Top 5 (Epoch 20)acc (%): 83.95 +2024-05-23 20:10:50,276 - INFO - Top 10 (Epoch 20)acc (%): 92.23 +2024-05-23 20:10:50,277 - INFO - +Epoch 21 +2024-05-23 20:10:55,565 - INFO - [65536/66739] Loss : 1.0140 +2024-05-23 20:10:55,822 - INFO - Test loss : 0.5257 +2024-05-23 20:10:55,823 - INFO - +Epoch 22 +2024-05-23 20:11:01,237 - INFO - [65536/66739] Loss : 1.0135 +2024-05-23 20:11:01,601 - INFO - Test loss : 0.5216 +2024-05-23 20:11:01,601 - INFO - +Epoch 23 +2024-05-23 20:11:07,033 - INFO - [65536/66739] Loss : 1.0113 +2024-05-23 20:11:07,294 - INFO - Test loss : 0.5313 +2024-05-23 20:11:07,294 - INFO - +Epoch 24 +2024-05-23 20:11:12,653 - INFO - [65536/66739] Loss : 1.0064 +2024-05-23 20:11:12,913 - INFO - Test loss : 0.5348 +2024-05-23 20:11:12,913 - INFO - +Epoch 25 +2024-05-23 20:11:18,247 - INFO - [65536/66739] Loss : 1.0058 +2024-05-23 20:11:18,506 - INFO - Test loss : 0.5691 +2024-05-23 20:11:18,506 - INFO - +Epoch 26 +2024-05-23 20:11:23,840 - INFO - [65536/66739] Loss : 1.0045 +2024-05-23 20:11:24,296 - INFO - Test loss : 0.5317 +2024-05-23 20:11:24,296 - INFO - +Epoch 27 +2024-05-23 20:11:29,492 - INFO - [65536/66739] Loss : 1.0046 +2024-05-23 20:11:29,964 - INFO - Test loss : 0.5575 +2024-05-23 20:11:29,964 - INFO - +Epoch 28 +2024-05-23 20:11:35,712 - INFO - [65536/66739] Loss : 1.0025 +2024-05-23 20:11:35,973 - INFO - Test loss : 0.5494 +2024-05-23 20:11:35,973 - INFO - +Epoch 29 +2024-05-23 20:11:41,424 - INFO - [65536/66739] Loss : 1.0013 +2024-05-23 20:11:41,683 - INFO - Test loss : 0.5467 +2024-05-23 20:11:41,683 - INFO - Saving output model to ../models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 20:11:41,688 - INFO - Saving output model to ../models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 20:11:41,697 - INFO - +No prior +2024-05-23 20:11:41,699 - INFO - (17798,) +2024-05-23 20:11:41,797 - INFO - Split ID: 0 +2024-05-23 20:11:41,797 - INFO - Top 1 acc (%): 50.15 +2024-05-23 20:11:41,797 - INFO - Top 3 acc (%): 73.9 +2024-05-23 20:11:41,797 - INFO - Top 5 acc (%): 82.45 +2024-05-23 20:11:41,797 - INFO - Top 10 acc (%): 91.06 +2024-05-23 20:11:50,509 - INFO - Split ID: 0 +2024-05-23 20:11:50,509 - INFO - Top 1 acc (%): 51.24 +2024-05-23 20:11:50,509 - INFO - Top 3 acc (%): 75.44 +2024-05-23 20:11:50,509 - INFO - Top 5 acc (%): 83.94 +2024-05-23 20:11:50,509 - INFO - Top 10 acc (%): 92.18 +2024-05-23 20:11:50,510 - 
INFO - +Space2Vec-theory +2024-05-23 20:11:50,510 - INFO - Model : model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-23 20:11:50,547 - INFO - (4449,) +2024-05-23 20:11:50,570 - INFO - Split ID: 0 +2024-05-23 20:11:50,571 - INFO - Top 1 LocEnc acc (%): 11.8 +2024-05-23 20:11:50,572 - INFO - Top 3 LocEnc acc (%): 23.67 +2024-05-23 20:11:50,572 - INFO - Top 5 LocEnc acc (%): 32.23 +2024-05-23 20:11:50,572 - INFO - Top 10 LocEnc acc (%): 47.76 +2024-05-31 01:55:41,579 - INFO - +num_classes 100 +2024-05-31 01:55:41,579 - INFO - num train 66739 +2024-05-31 01:55:41,579 - INFO - num val 4449 +2024-05-31 01:55:41,579 - INFO - train loss full_loss +2024-05-31 01:55:41,579 - INFO - model name ../models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:55:41,579 - INFO - num users 1 +2024-05-31 01:55:42,321 - INFO - +Only Space2Vec-theory +2024-05-31 01:55:42,321 - INFO - Model : model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:55:42,376 - INFO - Saving output model to ../models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:55:42,397 - INFO - Saving output model to ../models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:55:42,408 - INFO - +No prior +2024-05-31 01:55:42,413 - INFO - (17798,) +2024-05-31 01:55:42,840 - INFO - Save results to ../eval_results/eval_yfcc__test_no_prior.csv +2024-05-31 01:55:42,840 - INFO - Split ID: 0 +2024-05-31 01:55:42,841 - INFO - Top 1 acc (%): 50.15 +2024-05-31 01:55:42,841 - INFO - Top 3 acc (%): 73.9 +2024-05-31 01:55:42,841 - INFO - Top 5 acc (%): 82.45 +2024-05-31 01:55:42,841 - INFO - Top 10 acc (%): 91.06 +2024-05-31 01:55:55,130 - INFO - Split ID: 0 +2024-05-31 01:55:55,130 - INFO - Top 1 hit (%): 51.24 +2024-05-31 01:55:55,130 - INFO - Top 3 hit (%): 75.44 +2024-05-31 01:55:55,130 - INFO - Top 5 hit (%): 83.94 +2024-05-31 01:55:55,130 - INFO - Top 10 hit (%): 92.18 +2024-05-31 01:55:55,136 - INFO - +Only Space2Vec-theory +2024-05-31 01:55:55,136 - INFO - Model : model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 01:55:55,224 - INFO - (4449,) +2024-05-31 01:55:55,249 - INFO - Split ID: 0 +2024-05-31 01:55:55,250 - INFO - Top 1 LocEnc acc (%): 11.8 +2024-05-31 01:55:55,250 - INFO - Top 3 LocEnc acc (%): 23.67 +2024-05-31 01:55:55,251 - INFO - Top 5 LocEnc acc (%): 32.23 +2024-05-31 01:55:55,251 - INFO - Top 10 LocEnc acc (%): 47.76 diff --git a/pre_trained_models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..15a45275 Binary files /dev/null and b/pre_trained_models/space2vec_theory/model_yfcc_Space2Vec-theory_inception_v3_0.0100_64_0.0500000_360.000_1_256_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.log 
b/pre_trained_models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..df079149 --- /dev/null +++ b/pre_trained_models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.log @@ -0,0 +1,639 @@ +2024-05-30 09:42:27,060 - INFO - +num_classes 500 +2024-05-30 09:42:27,060 - INFO - num train 42490 +2024-05-30 09:42:27,060 - INFO - num val 980 +2024-05-30 09:42:27,060 - INFO - train loss full_loss +2024-05-30 09:42:27,060 - INFO - model name ../models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 09:42:27,060 - INFO - num users 5763 +2024-05-30 09:42:27,060 - INFO - meta data ebird_meta +2024-05-30 09:42:27,920 - INFO - +Epoch 0 +2024-05-30 09:42:45,086 - INFO - [40960/42490] Loss : 2.0212 +2024-05-30 09:42:45,165 - INFO - Test loss : 0.6313 +2024-05-30 09:42:45,165 - INFO - +Epoch 1 +2024-05-30 09:43:01,624 - INFO - [40960/42490] Loss : 1.4357 +2024-05-30 09:43:02,102 - INFO - Test loss : 0.5213 +2024-05-30 09:43:02,102 - INFO - +Epoch 2 +2024-05-30 09:43:21,355 - INFO - [40960/42490] Loss : 1.3273 +2024-05-30 09:43:21,833 - INFO - Test loss : 0.4779 +2024-05-30 09:43:21,833 - INFO - +Epoch 3 +2024-05-30 09:43:39,460 - INFO - [40960/42490] Loss : 1.2333 +2024-05-30 09:43:39,933 - INFO - Test loss : 0.4224 +2024-05-30 09:43:39,933 - INFO - +Epoch 4 +2024-05-30 09:43:58,185 - INFO - [40960/42490] Loss : 1.1870 +2024-05-30 09:43:58,656 - INFO - Test loss : 0.4142 +2024-05-30 09:43:58,656 - INFO - +Epoch 5 +2024-05-30 09:44:15,808 - INFO - [40960/42490] Loss : 1.1523 +2024-05-30 09:44:16,284 - INFO - Test loss : 0.3983 +2024-05-30 09:44:16,285 - INFO - +Epoch 6 +2024-05-30 09:44:33,853 - INFO - [40960/42490] Loss : 1.1277 +2024-05-30 09:44:34,321 - INFO - Test loss : 0.4161 +2024-05-30 09:44:34,321 - INFO - +Epoch 7 +2024-05-30 09:44:53,146 - INFO - [40960/42490] Loss : 1.1080 +2024-05-30 09:44:53,624 - INFO - Test loss : 0.3978 +2024-05-30 09:44:53,624 - INFO - +Epoch 8 +2024-05-30 09:45:09,981 - INFO - [40960/42490] Loss : 1.0903 +2024-05-30 09:45:10,457 - INFO - Test loss : 0.3843 +2024-05-30 09:45:10,458 - INFO - +Epoch 9 +2024-05-30 09:45:27,620 - INFO - [40960/42490] Loss : 1.0710 +2024-05-30 09:45:28,104 - INFO - Test loss : 0.3856 +2024-05-30 09:45:28,104 - INFO - +Epoch 10 +2024-05-30 09:45:47,248 - INFO - [40960/42490] Loss : 1.0511 +2024-05-30 09:45:47,720 - INFO - Test loss : 0.3664 +2024-05-30 09:45:47,732 - INFO - (980,) +2024-05-30 09:45:47,769 - INFO - Split ID: 0 +2024-05-30 09:45:47,769 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 1.73 +2024-05-30 09:45:47,769 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 4.08 +2024-05-30 09:45:47,769 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 5.92 +2024-05-30 09:45:47,769 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 11.94 +2024-05-30 09:45:47,770 - INFO - +No prior +2024-05-30 09:45:47,771 - INFO - (2262,) +2024-05-30 09:45:47,824 - INFO - Split ID: 0 +2024-05-30 09:45:47,824 - INFO - Top 1 (Epoch 10)acc (%): 70.07 +2024-05-30 09:45:47,824 - INFO - Top 3 (Epoch 10)acc (%): 86.6 +2024-05-30 09:45:47,824 - INFO - Top 5 (Epoch 10)acc (%): 90.05 +2024-05-30 09:45:47,824 - INFO - Top 10 (Epoch 10)acc (%): 92.88 +2024-05-30 09:45:49,827 - INFO - Split ID: 0 +2024-05-30 09:45:49,827 - INFO - Top 1 (Epoch 10)acc (%): 75.6 +2024-05-30 09:45:49,827 - INFO - Top 3 (Epoch 10)acc (%): 89.39 +2024-05-30 
09:45:49,827 - INFO - Top 5 (Epoch 10)acc (%): 92.26 +2024-05-30 09:45:49,827 - INFO - Top 10 (Epoch 10)acc (%): 94.83 +2024-05-30 09:45:49,827 - INFO - +Epoch 11 +2024-05-30 09:46:07,705 - INFO - [40960/42490] Loss : 1.0305 +2024-05-30 09:46:07,895 - INFO - Test loss : 0.3499 +2024-05-30 09:46:07,896 - INFO - +Epoch 12 +2024-05-30 09:46:25,372 - INFO - [40960/42490] Loss : 1.0125 +2024-05-30 09:46:25,574 - INFO - Test loss : 0.3417 +2024-05-30 09:46:25,574 - INFO - +Epoch 13 +2024-05-30 09:46:43,809 - INFO - [40960/42490] Loss : 0.9936 +2024-05-30 09:46:44,284 - INFO - Test loss : 0.3309 +2024-05-30 09:46:44,284 - INFO - +Epoch 14 +2024-05-30 09:47:02,529 - INFO - [40960/42490] Loss : 0.9753 +2024-05-30 09:47:03,007 - INFO - Test loss : 0.3134 +2024-05-30 09:47:03,007 - INFO - +Epoch 15 +2024-05-30 09:47:20,017 - INFO - [40960/42490] Loss : 0.9585 +2024-05-30 09:47:20,355 - INFO - Test loss : 0.3124 +2024-05-30 09:47:20,355 - INFO - +Epoch 16 +2024-05-30 09:47:40,212 - INFO - [40960/42490] Loss : 0.9439 +2024-05-30 09:47:40,685 - INFO - Test loss : 0.3107 +2024-05-30 09:47:40,686 - INFO - +Epoch 17 +2024-05-30 09:47:58,614 - INFO - [40960/42490] Loss : 0.9379 +2024-05-30 09:47:59,092 - INFO - Test loss : 0.3093 +2024-05-30 09:47:59,092 - INFO - +Epoch 18 +2024-05-30 09:48:17,414 - INFO - [40960/42490] Loss : 0.9286 +2024-05-30 09:48:17,892 - INFO - Test loss : 0.3141 +2024-05-30 09:48:17,892 - INFO - +Epoch 19 +2024-05-30 09:48:35,428 - INFO - [40960/42490] Loss : 0.9239 +2024-05-30 09:48:35,905 - INFO - Test loss : 0.3016 +2024-05-30 09:48:35,905 - INFO - +Epoch 20 +2024-05-30 09:48:52,905 - INFO - [40960/42490] Loss : 0.9147 +2024-05-30 09:48:53,304 - INFO - Test loss : 0.3105 +2024-05-30 09:48:53,309 - INFO - (980,) +2024-05-30 09:48:53,345 - INFO - Split ID: 0 +2024-05-30 09:48:53,345 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 2.35 +2024-05-30 09:48:53,346 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 7.04 +2024-05-30 09:48:53,346 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 11.33 +2024-05-30 09:48:53,346 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 20.1 +2024-05-30 09:48:53,346 - INFO - +No prior +2024-05-30 09:48:53,347 - INFO - (2262,) +2024-05-30 09:48:53,396 - INFO - Split ID: 0 +2024-05-30 09:48:53,397 - INFO - Top 1 (Epoch 20)acc (%): 70.07 +2024-05-30 09:48:53,397 - INFO - Top 3 (Epoch 20)acc (%): 86.6 +2024-05-30 09:48:53,397 - INFO - Top 5 (Epoch 20)acc (%): 90.05 +2024-05-30 09:48:53,397 - INFO - Top 10 (Epoch 20)acc (%): 92.88 +2024-05-30 09:48:55,350 - INFO - Split ID: 0 +2024-05-30 09:48:55,350 - INFO - Top 1 (Epoch 20)acc (%): 77.37 +2024-05-30 09:48:55,351 - INFO - Top 3 (Epoch 20)acc (%): 90.27 +2024-05-30 09:48:55,351 - INFO - Top 5 (Epoch 20)acc (%): 92.88 +2024-05-30 09:48:55,351 - INFO - Top 10 (Epoch 20)acc (%): 95.31 +2024-05-30 09:48:55,351 - INFO - +Epoch 21 +2024-05-30 09:49:12,450 - INFO - [40960/42490] Loss : 0.9106 +2024-05-30 09:49:12,939 - INFO - Test loss : 0.3063 +2024-05-30 09:49:12,939 - INFO - +Epoch 22 +2024-05-30 09:49:32,250 - INFO - [40960/42490] Loss : 0.9059 +2024-05-30 09:49:32,392 - INFO - Test loss : 0.3114 +2024-05-30 09:49:32,392 - INFO - +Epoch 23 +2024-05-30 09:49:50,686 - INFO - [40960/42490] Loss : 0.9013 +2024-05-30 09:49:50,879 - INFO - Test loss : 0.2955 +2024-05-30 09:49:50,879 - INFO - +Epoch 24 +2024-05-30 09:50:07,763 - INFO - [40960/42490] Loss : 0.8951 +2024-05-30 09:50:08,079 - INFO - Test loss : 0.3268 +2024-05-30 09:50:08,079 - INFO - +Epoch 25 +2024-05-30 09:50:24,550 - INFO - [40960/42490] Loss : 0.8914 +2024-05-30 09:50:25,025 - INFO 
- Test loss : 0.3002 +2024-05-30 09:50:25,025 - INFO - +Epoch 26 +2024-05-30 09:50:41,425 - INFO - [40960/42490] Loss : 0.8880 +2024-05-30 09:50:41,894 - INFO - Test loss : 0.2971 +2024-05-30 09:50:41,894 - INFO - +Epoch 27 +2024-05-30 09:51:00,470 - INFO - [40960/42490] Loss : 0.8820 +2024-05-30 09:51:00,947 - INFO - Test loss : 0.3045 +2024-05-30 09:51:00,948 - INFO - +Epoch 28 +2024-05-30 09:51:18,600 - INFO - [40960/42490] Loss : 0.8789 +2024-05-30 09:51:19,078 - INFO - Test loss : 0.3029 +2024-05-30 09:51:19,078 - INFO - +Epoch 29 +2024-05-30 09:51:37,453 - INFO - [40960/42490] Loss : 0.8772 +2024-05-30 09:51:37,931 - INFO - Test loss : 0.3032 +2024-05-30 09:51:37,931 - INFO - +Epoch 30 +2024-05-30 09:51:57,017 - INFO - [40960/42490] Loss : 0.8728 +2024-05-30 09:51:57,491 - INFO - Test loss : 0.3011 +2024-05-30 09:51:57,497 - INFO - (980,) +2024-05-30 09:51:57,531 - INFO - Split ID: 0 +2024-05-30 09:51:57,532 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 3.57 +2024-05-30 09:51:57,532 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 8.57 +2024-05-30 09:51:57,532 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 12.86 +2024-05-30 09:51:57,532 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 21.84 +2024-05-30 09:51:57,532 - INFO - +No prior +2024-05-30 09:51:57,533 - INFO - (2262,) +2024-05-30 09:51:57,583 - INFO - Split ID: 0 +2024-05-30 09:51:57,583 - INFO - Top 1 (Epoch 30)acc (%): 70.07 +2024-05-30 09:51:57,583 - INFO - Top 3 (Epoch 30)acc (%): 86.6 +2024-05-30 09:51:57,583 - INFO - Top 5 (Epoch 30)acc (%): 90.05 +2024-05-30 09:51:57,583 - INFO - Top 10 (Epoch 30)acc (%): 92.88 +2024-05-30 09:51:59,584 - INFO - Split ID: 0 +2024-05-30 09:51:59,585 - INFO - Top 1 (Epoch 30)acc (%): 77.81 +2024-05-30 09:51:59,585 - INFO - Top 3 (Epoch 30)acc (%): 90.72 +2024-05-30 09:51:59,585 - INFO - Top 5 (Epoch 30)acc (%): 92.79 +2024-05-30 09:51:59,585 - INFO - Top 10 (Epoch 30)acc (%): 95.49 +2024-05-30 09:51:59,585 - INFO - +Epoch 31 +2024-05-30 09:52:17,309 - INFO - [40960/42490] Loss : 0.8724 +2024-05-30 09:52:17,750 - INFO - Test loss : 0.2902 +2024-05-30 09:52:17,750 - INFO - +Epoch 32 +2024-05-30 09:52:36,071 - INFO - [40960/42490] Loss : 0.8702 +2024-05-30 09:52:36,548 - INFO - Test loss : 0.3013 +2024-05-30 09:52:36,548 - INFO - +Epoch 33 +2024-05-30 09:52:54,503 - INFO - [40960/42490] Loss : 0.8663 +2024-05-30 09:52:54,981 - INFO - Test loss : 0.3036 +2024-05-30 09:52:54,981 - INFO - +Epoch 34 +2024-05-30 09:53:12,352 - INFO - [40960/42490] Loss : 0.8638 +2024-05-30 09:53:12,823 - INFO - Test loss : 0.2981 +2024-05-30 09:53:12,823 - INFO - +Epoch 35 +2024-05-30 09:53:30,635 - INFO - [40960/42490] Loss : 0.8611 +2024-05-30 09:53:31,076 - INFO - Test loss : 0.3036 +2024-05-30 09:53:31,076 - INFO - +Epoch 36 +2024-05-30 09:53:48,147 - INFO - [40960/42490] Loss : 0.8600 +2024-05-30 09:53:48,619 - INFO - Test loss : 0.2970 +2024-05-30 09:53:48,619 - INFO - +Epoch 37 +2024-05-30 09:54:06,467 - INFO - [40960/42490] Loss : 0.8580 +2024-05-30 09:54:06,953 - INFO - Test loss : 0.2944 +2024-05-30 09:54:06,953 - INFO - +Epoch 38 +2024-05-30 09:54:25,910 - INFO - [40960/42490] Loss : 0.8578 +2024-05-30 09:54:26,386 - INFO - Test loss : 0.3065 +2024-05-30 09:54:26,387 - INFO - +Epoch 39 +2024-05-30 09:54:42,584 - INFO - [40960/42490] Loss : 0.8558 +2024-05-30 09:54:42,914 - INFO - Test loss : 0.3093 +2024-05-30 09:54:42,914 - INFO - +Epoch 40 +2024-05-30 09:55:00,185 - INFO - [40960/42490] Loss : 0.8527 +2024-05-30 09:55:00,439 - INFO - Test loss : 0.3010 +2024-05-30 09:55:00,443 - INFO - (980,) +2024-05-30 09:55:00,478 - INFO - 
Split ID: 0 +2024-05-30 09:55:00,479 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 3.57 +2024-05-30 09:55:00,479 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 9.49 +2024-05-30 09:55:00,479 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 13.98 +2024-05-30 09:55:00,479 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 21.22 +2024-05-30 09:55:00,480 - INFO - +No prior +2024-05-30 09:55:00,481 - INFO - (2262,) +2024-05-30 09:55:00,531 - INFO - Split ID: 0 +2024-05-30 09:55:00,531 - INFO - Top 1 (Epoch 40)acc (%): 70.07 +2024-05-30 09:55:00,531 - INFO - Top 3 (Epoch 40)acc (%): 86.6 +2024-05-30 09:55:00,531 - INFO - Top 5 (Epoch 40)acc (%): 90.05 +2024-05-30 09:55:00,531 - INFO - Top 10 (Epoch 40)acc (%): 92.88 +2024-05-30 09:55:02,497 - INFO - Split ID: 0 +2024-05-30 09:55:02,498 - INFO - Top 1 (Epoch 40)acc (%): 78.12 +2024-05-30 09:55:02,498 - INFO - Top 3 (Epoch 40)acc (%): 90.72 +2024-05-30 09:55:02,498 - INFO - Top 5 (Epoch 40)acc (%): 92.97 +2024-05-30 09:55:02,498 - INFO - Top 10 (Epoch 40)acc (%): 95.53 +2024-05-30 09:55:02,498 - INFO - +Epoch 41 +2024-05-30 09:55:20,592 - INFO - [40960/42490] Loss : 0.8504 +2024-05-30 09:55:20,648 - INFO - Test loss : 0.2932 +2024-05-30 09:55:20,648 - INFO - +Epoch 42 +2024-05-30 09:55:38,785 - INFO - [40960/42490] Loss : 0.8517 +2024-05-30 09:55:39,264 - INFO - Test loss : 0.3010 +2024-05-30 09:55:39,264 - INFO - +Epoch 43 +2024-05-30 09:55:55,604 - INFO - [40960/42490] Loss : 0.8474 +2024-05-30 09:55:56,073 - INFO - Test loss : 0.3009 +2024-05-30 09:55:56,073 - INFO - +Epoch 44 +2024-05-30 09:56:13,847 - INFO - [40960/42490] Loss : 0.8485 +2024-05-30 09:56:14,324 - INFO - Test loss : 0.3058 +2024-05-30 09:56:14,324 - INFO - +Epoch 45 +2024-05-30 09:56:32,533 - INFO - [40960/42490] Loss : 0.8459 +2024-05-30 09:56:32,580 - INFO - Test loss : 0.2997 +2024-05-30 09:56:32,580 - INFO - +Epoch 46 +2024-05-30 09:56:51,359 - INFO - [40960/42490] Loss : 0.8454 +2024-05-30 09:56:51,406 - INFO - Test loss : 0.2869 +2024-05-30 09:56:51,406 - INFO - +Epoch 47 +2024-05-30 09:57:09,396 - INFO - [40960/42490] Loss : 0.8454 +2024-05-30 09:57:09,780 - INFO - Test loss : 0.3131 +2024-05-30 09:57:09,781 - INFO - +Epoch 48 +2024-05-30 09:57:27,729 - INFO - [40960/42490] Loss : 0.8410 +2024-05-30 09:57:28,201 - INFO - Test loss : 0.3061 +2024-05-30 09:57:28,201 - INFO - +Epoch 49 +2024-05-30 09:57:46,153 - INFO - [40960/42490] Loss : 0.8398 +2024-05-30 09:57:46,514 - INFO - Test loss : 0.3052 +2024-05-30 09:57:46,514 - INFO - +Epoch 50 +2024-05-30 09:58:02,837 - INFO - [40960/42490] Loss : 0.8384 +2024-05-30 09:58:02,916 - INFO - Test loss : 0.2970 +2024-05-30 09:58:02,921 - INFO - (980,) +2024-05-30 09:58:02,956 - INFO - Split ID: 0 +2024-05-30 09:58:02,957 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 3.67 +2024-05-30 09:58:02,957 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 8.67 +2024-05-30 09:58:02,957 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 12.96 +2024-05-30 09:58:02,957 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 22.65 +2024-05-30 09:58:02,958 - INFO - +No prior +2024-05-30 09:58:02,958 - INFO - (2262,) +2024-05-30 09:58:03,014 - INFO - Split ID: 0 +2024-05-30 09:58:03,015 - INFO - Top 1 (Epoch 50)acc (%): 70.07 +2024-05-30 09:58:03,015 - INFO - Top 3 (Epoch 50)acc (%): 86.6 +2024-05-30 09:58:03,015 - INFO - Top 5 (Epoch 50)acc (%): 90.05 +2024-05-30 09:58:03,015 - INFO - Top 10 (Epoch 50)acc (%): 92.88 +2024-05-30 09:58:05,021 - INFO - Split ID: 0 +2024-05-30 09:58:05,021 - INFO - Top 1 (Epoch 50)acc (%): 78.16 +2024-05-30 09:58:05,021 - INFO - Top 3 (Epoch 50)acc (%): 90.67 +2024-05-30 09:58:05,021 
- INFO - Top 5 (Epoch 50)acc (%): 93.02 +2024-05-30 09:58:05,021 - INFO - Top 10 (Epoch 50)acc (%): 95.62 +2024-05-30 09:58:05,021 - INFO - +Epoch 51 +2024-05-30 09:58:21,539 - INFO - [40960/42490] Loss : 0.8375 +2024-05-30 09:58:22,012 - INFO - Test loss : 0.3068 +2024-05-30 09:58:22,012 - INFO - +Epoch 52 +2024-05-30 09:58:39,122 - INFO - [40960/42490] Loss : 0.8384 +2024-05-30 09:58:39,599 - INFO - Test loss : 0.3121 +2024-05-30 09:58:39,600 - INFO - +Epoch 53 +2024-05-30 09:58:58,738 - INFO - [40960/42490] Loss : 0.8346 +2024-05-30 09:58:59,210 - INFO - Test loss : 0.3078 +2024-05-30 09:58:59,210 - INFO - +Epoch 54 +2024-05-30 09:59:14,393 - INFO - [40960/42490] Loss : 0.8339 +2024-05-30 09:59:14,450 - INFO - Test loss : 0.3099 +2024-05-30 09:59:14,450 - INFO - +Epoch 55 +2024-05-30 09:59:16,910 - INFO - [40960/42490] Loss : 0.8343 +2024-05-30 09:59:16,957 - INFO - Test loss : 0.3074 +2024-05-30 09:59:16,957 - INFO - +Epoch 56 +2024-05-30 09:59:19,377 - INFO - [40960/42490] Loss : 0.8322 +2024-05-30 09:59:19,424 - INFO - Test loss : 0.3154 +2024-05-30 09:59:19,424 - INFO - +Epoch 57 +2024-05-30 09:59:21,849 - INFO - [40960/42490] Loss : 0.8314 +2024-05-30 09:59:21,896 - INFO - Test loss : 0.3169 +2024-05-30 09:59:21,896 - INFO - +Epoch 58 +2024-05-30 09:59:24,321 - INFO - [40960/42490] Loss : 0.8314 +2024-05-30 09:59:24,368 - INFO - Test loss : 0.3139 +2024-05-30 09:59:24,368 - INFO - +Epoch 59 +2024-05-30 09:59:26,789 - INFO - [40960/42490] Loss : 0.8287 +2024-05-30 09:59:26,836 - INFO - Test loss : 0.3154 +2024-05-30 09:59:26,836 - INFO - +Epoch 60 +2024-05-30 09:59:29,267 - INFO - [40960/42490] Loss : 0.8283 +2024-05-30 09:59:29,315 - INFO - Test loss : 0.3184 +2024-05-30 09:59:29,320 - INFO - (980,) +2024-05-30 09:59:29,356 - INFO - Split ID: 0 +2024-05-30 09:59:29,356 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 4.49 +2024-05-30 09:59:29,357 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 9.69 +2024-05-30 09:59:29,357 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 14.39 +2024-05-30 09:59:29,357 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 21.22 +2024-05-30 09:59:29,357 - INFO - +No prior +2024-05-30 09:59:29,358 - INFO - (2262,) +2024-05-30 09:59:29,409 - INFO - Split ID: 0 +2024-05-30 09:59:29,409 - INFO - Top 1 (Epoch 60)acc (%): 70.07 +2024-05-30 09:59:29,409 - INFO - Top 3 (Epoch 60)acc (%): 86.6 +2024-05-30 09:59:29,409 - INFO - Top 5 (Epoch 60)acc (%): 90.05 +2024-05-30 09:59:29,409 - INFO - Top 10 (Epoch 60)acc (%): 92.88 +2024-05-30 09:59:30,758 - INFO - Split ID: 0 +2024-05-30 09:59:30,758 - INFO - Top 1 (Epoch 60)acc (%): 78.34 +2024-05-30 09:59:30,758 - INFO - Top 3 (Epoch 60)acc (%): 90.94 +2024-05-30 09:59:30,758 - INFO - Top 5 (Epoch 60)acc (%): 92.88 +2024-05-30 09:59:30,758 - INFO - Top 10 (Epoch 60)acc (%): 95.58 +2024-05-30 09:59:30,758 - INFO - +Epoch 61 +2024-05-30 09:59:33,186 - INFO - [40960/42490] Loss : 0.8295 +2024-05-30 09:59:33,233 - INFO - Test loss : 0.3185 +2024-05-30 09:59:33,233 - INFO - +Epoch 62 +2024-05-30 09:59:35,670 - INFO - [40960/42490] Loss : 0.8273 +2024-05-30 09:59:35,717 - INFO - Test loss : 0.3107 +2024-05-30 09:59:35,717 - INFO - +Epoch 63 +2024-05-30 09:59:38,146 - INFO - [40960/42490] Loss : 0.8255 +2024-05-30 09:59:38,193 - INFO - Test loss : 0.3066 +2024-05-30 09:59:38,193 - INFO - +Epoch 64 +2024-05-30 09:59:40,647 - INFO - [40960/42490] Loss : 0.8262 +2024-05-30 09:59:40,694 - INFO - Test loss : 0.3071 +2024-05-30 09:59:40,695 - INFO - +Epoch 65 +2024-05-30 09:59:43,167 - INFO - [40960/42490] Loss : 0.8233 +2024-05-30 09:59:43,215 - INFO - Test loss 
: 0.3104 +2024-05-30 09:59:43,215 - INFO - +Epoch 66 +2024-05-30 09:59:45,679 - INFO - [40960/42490] Loss : 0.8250 +2024-05-30 09:59:45,727 - INFO - Test loss : 0.3081 +2024-05-30 09:59:45,727 - INFO - +Epoch 67 +2024-05-30 09:59:48,204 - INFO - [40960/42490] Loss : 0.8219 +2024-05-30 09:59:48,251 - INFO - Test loss : 0.3194 +2024-05-30 09:59:48,252 - INFO - +Epoch 68 +2024-05-30 09:59:50,712 - INFO - [40960/42490] Loss : 0.8226 +2024-05-30 09:59:50,761 - INFO - Test loss : 0.2966 +2024-05-30 09:59:50,761 - INFO - +Epoch 69 +2024-05-30 09:59:53,229 - INFO - [40960/42490] Loss : 0.8219 +2024-05-30 09:59:53,276 - INFO - Test loss : 0.3077 +2024-05-30 09:59:53,276 - INFO - +Epoch 70 +2024-05-30 09:59:55,759 - INFO - [40960/42490] Loss : 0.8214 +2024-05-30 09:59:55,806 - INFO - Test loss : 0.3064 +2024-05-30 09:59:55,810 - INFO - (980,) +2024-05-30 09:59:55,846 - INFO - Split ID: 0 +2024-05-30 09:59:55,847 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 3.88 +2024-05-30 09:59:55,847 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 9.8 +2024-05-30 09:59:55,847 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 13.88 +2024-05-30 09:59:55,847 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 21.33 +2024-05-30 09:59:55,847 - INFO - +No prior +2024-05-30 09:59:55,848 - INFO - (2262,) +2024-05-30 09:59:55,898 - INFO - Split ID: 0 +2024-05-30 09:59:55,898 - INFO - Top 1 (Epoch 70)acc (%): 70.07 +2024-05-30 09:59:55,898 - INFO - Top 3 (Epoch 70)acc (%): 86.6 +2024-05-30 09:59:55,898 - INFO - Top 5 (Epoch 70)acc (%): 90.05 +2024-05-30 09:59:55,898 - INFO - Top 10 (Epoch 70)acc (%): 92.88 +2024-05-30 09:59:57,253 - INFO - Split ID: 0 +2024-05-30 09:59:57,253 - INFO - Top 1 (Epoch 70)acc (%): 78.51 +2024-05-30 09:59:57,253 - INFO - Top 3 (Epoch 70)acc (%): 90.89 +2024-05-30 09:59:57,253 - INFO - Top 5 (Epoch 70)acc (%): 92.97 +2024-05-30 09:59:57,253 - INFO - Top 10 (Epoch 70)acc (%): 95.58 +2024-05-30 09:59:57,253 - INFO - +Epoch 71 +2024-05-30 09:59:59,683 - INFO - [40960/42490] Loss : 0.8186 +2024-05-30 09:59:59,730 - INFO - Test loss : 0.3178 +2024-05-30 09:59:59,731 - INFO - +Epoch 72 +2024-05-30 10:00:02,213 - INFO - [40960/42490] Loss : 0.8191 +2024-05-30 10:00:02,260 - INFO - Test loss : 0.2958 +2024-05-30 10:00:02,260 - INFO - +Epoch 73 +2024-05-30 10:00:04,688 - INFO - [40960/42490] Loss : 0.8191 +2024-05-30 10:00:04,736 - INFO - Test loss : 0.3110 +2024-05-30 10:00:04,736 - INFO - +Epoch 74 +2024-05-30 10:00:07,172 - INFO - [40960/42490] Loss : 0.8186 +2024-05-30 10:00:07,219 - INFO - Test loss : 0.3149 +2024-05-30 10:00:07,219 - INFO - +Epoch 75 +2024-05-30 10:00:09,694 - INFO - [40960/42490] Loss : 0.8171 +2024-05-30 10:00:09,741 - INFO - Test loss : 0.3120 +2024-05-30 10:00:09,741 - INFO - +Epoch 76 +2024-05-30 10:00:12,192 - INFO - [40960/42490] Loss : 0.8171 +2024-05-30 10:00:12,239 - INFO - Test loss : 0.3063 +2024-05-30 10:00:12,239 - INFO - +Epoch 77 +2024-05-30 10:00:14,669 - INFO - [40960/42490] Loss : 0.8176 +2024-05-30 10:00:14,716 - INFO - Test loss : 0.3075 +2024-05-30 10:00:14,716 - INFO - +Epoch 78 +2024-05-30 10:00:17,161 - INFO - [40960/42490] Loss : 0.8151 +2024-05-30 10:00:17,208 - INFO - Test loss : 0.3074 +2024-05-30 10:00:17,208 - INFO - +Epoch 79 +2024-05-30 10:00:19,637 - INFO - [40960/42490] Loss : 0.8151 +2024-05-30 10:00:19,694 - INFO - Test loss : 0.3223 +2024-05-30 10:00:19,694 - INFO - +Epoch 80 +2024-05-30 10:00:25,369 - INFO - [40960/42490] Loss : 0.8151 +2024-05-30 10:00:25,560 - INFO - Test loss : 0.3187 +2024-05-30 10:00:25,565 - INFO - (980,) +2024-05-30 10:00:25,601 - INFO - Split ID: 0 
+2024-05-30 10:00:25,601 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 4.29 +2024-05-30 10:00:25,602 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 9.69 +2024-05-30 10:00:25,602 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 13.67 +2024-05-30 10:00:25,602 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 21.94 +2024-05-30 10:00:25,602 - INFO - +No prior +2024-05-30 10:00:25,603 - INFO - (2262,) +2024-05-30 10:00:25,653 - INFO - Split ID: 0 +2024-05-30 10:00:25,653 - INFO - Top 1 (Epoch 80)acc (%): 70.07 +2024-05-30 10:00:25,653 - INFO - Top 3 (Epoch 80)acc (%): 86.6 +2024-05-30 10:00:25,653 - INFO - Top 5 (Epoch 80)acc (%): 90.05 +2024-05-30 10:00:25,654 - INFO - Top 10 (Epoch 80)acc (%): 92.88 +2024-05-30 10:00:27,710 - INFO - Split ID: 0 +2024-05-30 10:00:27,710 - INFO - Top 1 (Epoch 80)acc (%): 78.69 +2024-05-30 10:00:27,710 - INFO - Top 3 (Epoch 80)acc (%): 90.98 +2024-05-30 10:00:27,710 - INFO - Top 5 (Epoch 80)acc (%): 93.1 +2024-05-30 10:00:27,710 - INFO - Top 10 (Epoch 80)acc (%): 95.62 +2024-05-30 10:00:27,710 - INFO - +Epoch 81 +2024-05-30 10:00:36,224 - INFO - [40960/42490] Loss : 0.8143 +2024-05-30 10:00:36,406 - INFO - Test loss : 0.3151 +2024-05-30 10:00:36,406 - INFO - +Epoch 82 +2024-05-30 10:00:44,937 - INFO - [40960/42490] Loss : 0.8136 +2024-05-30 10:00:45,126 - INFO - Test loss : 0.3277 +2024-05-30 10:00:45,126 - INFO - +Epoch 83 +2024-05-30 10:00:53,695 - INFO - [40960/42490] Loss : 0.8156 +2024-05-30 10:00:53,890 - INFO - Test loss : 0.3103 +2024-05-30 10:00:53,890 - INFO - +Epoch 84 +2024-05-30 10:01:02,456 - INFO - [40960/42490] Loss : 0.8136 +2024-05-30 10:01:02,645 - INFO - Test loss : 0.3187 +2024-05-30 10:01:02,645 - INFO - +Epoch 85 +2024-05-30 10:01:11,222 - INFO - [40960/42490] Loss : 0.8113 +2024-05-30 10:01:11,414 - INFO - Test loss : 0.3206 +2024-05-30 10:01:11,414 - INFO - +Epoch 86 +2024-05-30 10:01:19,910 - INFO - [40960/42490] Loss : 0.8103 +2024-05-30 10:01:20,100 - INFO - Test loss : 0.3206 +2024-05-30 10:01:20,100 - INFO - +Epoch 87 +2024-05-30 10:01:28,661 - INFO - [40960/42490] Loss : 0.8101 +2024-05-30 10:01:28,854 - INFO - Test loss : 0.3204 +2024-05-30 10:01:28,854 - INFO - +Epoch 88 +2024-05-30 10:01:41,386 - INFO - [40960/42490] Loss : 0.8098 +2024-05-30 10:01:41,855 - INFO - Test loss : 0.3255 +2024-05-30 10:01:41,855 - INFO - +Epoch 89 +2024-05-30 10:01:59,205 - INFO - [40960/42490] Loss : 0.8080 +2024-05-30 10:01:59,678 - INFO - Test loss : 0.3189 +2024-05-30 10:01:59,678 - INFO - +Epoch 90 +2024-05-30 10:02:17,423 - INFO - [40960/42490] Loss : 0.8091 +2024-05-30 10:02:17,902 - INFO - Test loss : 0.3236 +2024-05-30 10:02:17,909 - INFO - (980,) +2024-05-30 10:02:17,946 - INFO - Split ID: 0 +2024-05-30 10:02:17,946 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 3.67 +2024-05-30 10:02:17,946 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 8.78 +2024-05-30 10:02:17,947 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 12.55 +2024-05-30 10:02:17,947 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 19.9 +2024-05-30 10:02:17,947 - INFO - +No prior +2024-05-30 10:02:17,948 - INFO - (2262,) +2024-05-30 10:02:17,999 - INFO - Split ID: 0 +2024-05-30 10:02:17,999 - INFO - Top 1 (Epoch 90)acc (%): 70.07 +2024-05-30 10:02:17,999 - INFO - Top 3 (Epoch 90)acc (%): 86.6 +2024-05-30 10:02:18,000 - INFO - Top 5 (Epoch 90)acc (%): 90.05 +2024-05-30 10:02:18,000 - INFO - Top 10 (Epoch 90)acc (%): 92.88 +2024-05-30 10:02:19,951 - INFO - Split ID: 0 +2024-05-30 10:02:19,951 - INFO - Top 1 (Epoch 90)acc (%): 78.34 +2024-05-30 10:02:19,952 - INFO - Top 3 (Epoch 90)acc (%): 91.03 +2024-05-30 10:02:19,952 - INFO - Top 
5 (Epoch 90)acc (%): 93.06 +2024-05-30 10:02:19,952 - INFO - Top 10 (Epoch 90)acc (%): 95.62 +2024-05-30 10:02:19,952 - INFO - +Epoch 91 +2024-05-30 10:02:36,856 - INFO - [40960/42490] Loss : 0.8102 +2024-05-30 10:02:37,331 - INFO - Test loss : 0.3148 +2024-05-30 10:02:37,331 - INFO - +Epoch 92 +2024-05-30 10:02:54,788 - INFO - [40960/42490] Loss : 0.8083 +2024-05-30 10:02:55,261 - INFO - Test loss : 0.3154 +2024-05-30 10:02:55,261 - INFO - +Epoch 93 +2024-05-30 10:03:13,180 - INFO - [40960/42490] Loss : 0.8072 +2024-05-30 10:03:13,655 - INFO - Test loss : 0.3198 +2024-05-30 10:03:13,655 - INFO - +Epoch 94 +2024-05-30 10:03:31,453 - INFO - [40960/42490] Loss : 0.8060 +2024-05-30 10:03:31,930 - INFO - Test loss : 0.3295 +2024-05-30 10:03:31,930 - INFO - +Epoch 95 +2024-05-30 10:03:51,097 - INFO - [40960/42490] Loss : 0.8070 +2024-05-30 10:03:51,575 - INFO - Test loss : 0.3224 +2024-05-30 10:03:51,575 - INFO - +Epoch 96 +2024-05-30 10:04:10,769 - INFO - [40960/42490] Loss : 0.8070 +2024-05-30 10:04:11,241 - INFO - Test loss : 0.3229 +2024-05-30 10:04:11,241 - INFO - +Epoch 97 +2024-05-30 10:04:29,904 - INFO - [40960/42490] Loss : 0.8061 +2024-05-30 10:04:30,382 - INFO - Test loss : 0.3124 +2024-05-30 10:04:30,382 - INFO - +Epoch 98 +2024-05-30 10:04:49,243 - INFO - [40960/42490] Loss : 0.8054 +2024-05-30 10:04:49,720 - INFO - Test loss : 0.3345 +2024-05-30 10:04:49,720 - INFO - +Epoch 99 +2024-05-30 10:05:08,173 - INFO - [40960/42490] Loss : 0.8067 +2024-05-30 10:05:08,643 - INFO - Test loss : 0.3203 +2024-05-30 10:05:08,643 - INFO - Saving output model to ../models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 10:05:08,670 - INFO - Saving output model to ../models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 10:05:08,756 - INFO - +No prior +2024-05-30 10:05:08,757 - INFO - (2262,) +2024-05-30 10:05:08,809 - INFO - Split ID: 0 +2024-05-30 10:05:08,809 - INFO - Top 1 acc (%): 70.07 +2024-05-30 10:05:08,809 - INFO - Top 3 acc (%): 86.6 +2024-05-30 10:05:08,809 - INFO - Top 5 acc (%): 90.05 +2024-05-30 10:05:08,809 - INFO - Top 10 acc (%): 92.88 +2024-05-30 10:05:10,810 - INFO - Split ID: 0 +2024-05-30 10:05:10,810 - INFO - Top 1 acc (%): 78.69 +2024-05-30 10:05:10,810 - INFO - Top 3 acc (%): 91.11 +2024-05-30 10:05:10,810 - INFO - Top 5 acc (%): 93.1 +2024-05-30 10:05:10,810 - INFO - Top 10 acc (%): 95.62 +2024-05-30 10:05:10,810 - INFO - +Sphere2Vec-dfs +2024-05-30 10:05:10,810 - INFO - Model : model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 10:05:10,830 - INFO - (980,) +2024-05-30 10:05:10,863 - INFO - Split ID: 0 +2024-05-30 10:05:10,863 - INFO - Top 1 LocEnc acc (%): 3.78 +2024-05-30 10:05:10,863 - INFO - Top 3 LocEnc acc (%): 9.29 +2024-05-30 10:05:10,863 - INFO - Top 5 LocEnc acc (%): 13.57 +2024-05-30 10:05:10,863 - INFO - Top 10 LocEnc acc (%): 21.73 +2024-05-31 03:57:08,166 - INFO - +num_classes 500 +2024-05-31 03:57:08,166 - INFO - num train 42490 +2024-05-31 03:57:08,166 - INFO - num val 980 +2024-05-31 03:57:08,166 - INFO - train loss full_loss +2024-05-31 03:57:08,166 - INFO - model name ../models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:57:08,166 - INFO - num users 5763 +2024-05-31 03:57:08,166 - INFO - meta data 
ebird_meta +2024-05-31 03:57:08,996 - INFO - +Only Sphere2Vec-dfs +2024-05-31 03:57:08,996 - INFO - Model : model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:57:09,011 - INFO - Saving output model to ../models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:57:09,142 - INFO - Saving output model to ../models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:57:09,270 - INFO - +No prior +2024-05-31 03:57:09,273 - INFO - (2262,) +2024-05-31 03:57:09,407 - INFO - Save results to ../eval_results/eval_birdsnap_ebird_meta_test_no_prior.csv +2024-05-31 03:57:09,407 - INFO - Split ID: 0 +2024-05-31 03:57:09,407 - INFO - Top 1 acc (%): 70.07 +2024-05-31 03:57:09,407 - INFO - Top 3 acc (%): 86.6 +2024-05-31 03:57:09,407 - INFO - Top 5 acc (%): 90.05 +2024-05-31 03:57:09,407 - INFO - Top 10 acc (%): 92.88 +2024-05-31 03:57:11,355 - INFO - Split ID: 0 +2024-05-31 03:57:11,355 - INFO - Top 1 hit (%): 78.69 +2024-05-31 03:57:11,356 - INFO - Top 3 hit (%): 91.11 +2024-05-31 03:57:11,356 - INFO - Top 5 hit (%): 93.1 +2024-05-31 03:57:11,356 - INFO - Top 10 hit (%): 95.62 +2024-05-31 03:57:11,356 - INFO - +Only Sphere2Vec-dfs +2024-05-31 03:57:11,356 - INFO - Model : model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:57:11,406 - INFO - (980,) +2024-05-31 03:57:11,446 - INFO - Split ID: 0 +2024-05-31 03:57:11,446 - INFO - Top 1 LocEnc acc (%): 3.78 +2024-05-31 03:57:11,447 - INFO - Top 3 LocEnc acc (%): 9.29 +2024-05-31 03:57:11,447 - INFO - Top 5 LocEnc acc (%): 13.57 +2024-05-31 03:57:11,447 - INFO - Top 10 LocEnc acc (%): 21.73 diff --git a/pre_trained_models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..6f10bb63 Binary files /dev/null and b/pre_trained_models/sphere2vec_dfs/model_birdsnap_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.1000000_1.000_3_512_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.log b/pre_trained_models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..cea6583c --- /dev/null +++ b/pre_trained_models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.log @@ -0,0 +1,226 @@ +2024-05-28 11:13:17,474 - INFO - +num_classes 500 +2024-05-28 11:13:17,474 - INFO - num train 19133 +2024-05-28 11:13:17,474 - INFO - num val 443 +2024-05-28 11:13:17,474 - INFO - train loss full_loss +2024-05-28 11:13:17,474 - INFO - model name ../models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-28 11:13:17,474 - INFO - num users 2872 +2024-05-28 11:13:17,474 - INFO - meta data orig_meta +2024-05-28 11:13:18,425 - INFO - +Epoch 0 +2024-05-28 11:13:38,131 - INFO - [16384/19133] Loss : 1.9478 +2024-05-28 
11:13:38,555 - INFO - Test loss : 0.7846 +2024-05-28 11:13:38,555 - INFO - +Epoch 1 +2024-05-28 11:13:57,667 - INFO - [16384/19133] Loss : 1.5133 +2024-05-28 11:13:58,094 - INFO - Test loss : 0.5164 +2024-05-28 11:13:58,094 - INFO - +Epoch 2 +2024-05-28 11:14:17,504 - INFO - [16384/19133] Loss : 1.4047 +2024-05-28 11:14:17,952 - INFO - Test loss : 0.5168 +2024-05-28 11:14:17,952 - INFO - +Epoch 3 +2024-05-28 11:14:36,696 - INFO - [16384/19133] Loss : 1.3407 +2024-05-28 11:14:37,037 - INFO - Test loss : 0.4175 +2024-05-28 11:14:37,037 - INFO - +Epoch 4 +2024-05-28 11:14:55,343 - INFO - [16384/19133] Loss : 1.3042 +2024-05-28 11:14:55,788 - INFO - Test loss : 0.4175 +2024-05-28 11:14:55,788 - INFO - +Epoch 5 +2024-05-28 11:15:14,404 - INFO - [16384/19133] Loss : 1.2707 +2024-05-28 11:15:14,843 - INFO - Test loss : 0.4085 +2024-05-28 11:15:14,843 - INFO - +Epoch 6 +2024-05-28 11:15:33,823 - INFO - [16384/19133] Loss : 1.2457 +2024-05-28 11:15:34,288 - INFO - Test loss : 0.3887 +2024-05-28 11:15:34,288 - INFO - +Epoch 7 +2024-05-28 11:15:53,129 - INFO - [16384/19133] Loss : 1.2177 +2024-05-28 11:15:53,474 - INFO - Test loss : 0.4191 +2024-05-28 11:15:53,474 - INFO - +Epoch 8 +2024-05-28 11:16:12,524 - INFO - [16384/19133] Loss : 1.1989 +2024-05-28 11:16:12,962 - INFO - Test loss : 0.3859 +2024-05-28 11:16:12,962 - INFO - +Epoch 9 +2024-05-28 11:16:31,629 - INFO - [16384/19133] Loss : 1.1847 +2024-05-28 11:16:31,989 - INFO - Test loss : 0.4202 +2024-05-28 11:16:31,989 - INFO - +Epoch 10 +2024-05-28 11:16:50,324 - INFO - [16384/19133] Loss : 1.1741 +2024-05-28 11:16:50,768 - INFO - Test loss : 0.4166 +2024-05-28 11:16:50,809 - INFO - (443,) +2024-05-28 11:16:50,828 - INFO - Split ID: 0 +2024-05-28 11:16:50,828 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 2.48 +2024-05-28 11:16:50,828 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 5.87 +2024-05-28 11:16:50,828 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 8.58 +2024-05-28 11:16:50,828 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 12.19 +2024-05-28 11:16:50,828 - INFO - +No prior +2024-05-28 11:16:50,829 - INFO - (2262,) +2024-05-28 11:16:50,896 - INFO - Split ID: 0 +2024-05-28 11:16:50,896 - INFO - Top 1 (Epoch 10)acc (%): 70.07 +2024-05-28 11:16:50,896 - INFO - Top 3 (Epoch 10)acc (%): 86.6 +2024-05-28 11:16:50,896 - INFO - Top 5 (Epoch 10)acc (%): 90.05 +2024-05-28 11:16:50,896 - INFO - Top 10 (Epoch 10)acc (%): 92.88 +2024-05-28 11:16:52,430 - INFO - Split ID: 0 +2024-05-28 11:16:52,430 - INFO - Top 1 (Epoch 10)acc (%): 71.13 +2024-05-28 11:16:52,430 - INFO - Top 3 (Epoch 10)acc (%): 86.56 +2024-05-28 11:16:52,430 - INFO - Top 5 (Epoch 10)acc (%): 89.83 +2024-05-28 11:16:52,430 - INFO - Top 10 (Epoch 10)acc (%): 93.15 +2024-05-28 11:16:52,430 - INFO - +Epoch 11 +2024-05-28 11:17:11,324 - INFO - [16384/19133] Loss : 1.1567 +2024-05-28 11:17:11,683 - INFO - Test loss : 0.4046 +2024-05-28 11:17:11,683 - INFO - +Epoch 12 +2024-05-28 11:17:30,705 - INFO - [16384/19133] Loss : 1.1508 +2024-05-28 11:17:31,154 - INFO - Test loss : 0.3950 +2024-05-28 11:17:31,154 - INFO - +Epoch 13 +2024-05-28 11:17:49,988 - INFO - [16384/19133] Loss : 1.1363 +2024-05-28 11:17:50,356 - INFO - Test loss : 0.4269 +2024-05-28 11:17:50,356 - INFO - +Epoch 14 +2024-05-28 11:18:09,237 - INFO - [16384/19133] Loss : 1.1240 +2024-05-28 11:18:09,680 - INFO - Test loss : 0.4370 +2024-05-28 11:18:09,680 - INFO - +Epoch 15 +2024-05-28 11:18:28,510 - INFO - [16384/19133] Loss : 1.1192 +2024-05-28 11:18:28,956 - INFO - Test loss : 0.4143 +2024-05-28 11:18:28,956 - INFO - +Epoch 16 +2024-05-28 
11:18:47,604 - INFO - [16384/19133] Loss : 1.1099 +2024-05-28 11:18:48,042 - INFO - Test loss : 0.4120 +2024-05-28 11:18:48,042 - INFO - +Epoch 17 +2024-05-28 11:19:06,799 - INFO - [16384/19133] Loss : 1.1025 +2024-05-28 11:19:07,253 - INFO - Test loss : 0.4324 +2024-05-28 11:19:07,254 - INFO - +Epoch 18 +2024-05-28 11:19:25,729 - INFO - [16384/19133] Loss : 1.0973 +2024-05-28 11:19:26,138 - INFO - Test loss : 0.4605 +2024-05-28 11:19:26,138 - INFO - +Epoch 19 +2024-05-28 11:19:46,684 - INFO - [16384/19133] Loss : 1.0963 +2024-05-28 11:19:47,160 - INFO - Test loss : 0.4666 +2024-05-28 11:19:47,160 - INFO - +Epoch 20 +2024-05-28 11:20:06,294 - INFO - [16384/19133] Loss : 1.0853 +2024-05-28 11:20:06,733 - INFO - Test loss : 0.4564 +2024-05-28 11:20:06,743 - INFO - (443,) +2024-05-28 11:20:06,759 - INFO - Split ID: 0 +2024-05-28 11:20:06,759 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 2.71 +2024-05-28 11:20:06,759 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 6.32 +2024-05-28 11:20:06,759 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 8.13 +2024-05-28 11:20:06,759 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 15.35 +2024-05-28 11:20:06,759 - INFO - +No prior +2024-05-28 11:20:06,760 - INFO - (2262,) +2024-05-28 11:20:06,813 - INFO - Split ID: 0 +2024-05-28 11:20:06,813 - INFO - Top 1 (Epoch 20)acc (%): 70.07 +2024-05-28 11:20:06,813 - INFO - Top 3 (Epoch 20)acc (%): 86.6 +2024-05-28 11:20:06,813 - INFO - Top 5 (Epoch 20)acc (%): 90.05 +2024-05-28 11:20:06,813 - INFO - Top 10 (Epoch 20)acc (%): 92.88 +2024-05-28 11:20:08,355 - INFO - Split ID: 0 +2024-05-28 11:20:08,355 - INFO - Top 1 (Epoch 20)acc (%): 71.62 +2024-05-28 11:20:08,355 - INFO - Top 3 (Epoch 20)acc (%): 86.52 +2024-05-28 11:20:08,355 - INFO - Top 5 (Epoch 20)acc (%): 89.79 +2024-05-28 11:20:08,355 - INFO - Top 10 (Epoch 20)acc (%): 93.15 +2024-05-28 11:20:08,355 - INFO - +Epoch 21 +2024-05-28 11:20:27,024 - INFO - [16384/19133] Loss : 1.0839 +2024-05-28 11:20:27,370 - INFO - Test loss : 0.4436 +2024-05-28 11:20:27,370 - INFO - +Epoch 22 +2024-05-28 11:20:46,225 - INFO - [16384/19133] Loss : 1.0801 +2024-05-28 11:20:46,672 - INFO - Test loss : 0.4594 +2024-05-28 11:20:46,672 - INFO - +Epoch 23 +2024-05-28 11:21:05,581 - INFO - [16384/19133] Loss : 1.0730 +2024-05-28 11:21:06,020 - INFO - Test loss : 0.4421 +2024-05-28 11:21:06,020 - INFO - +Epoch 24 +2024-05-28 11:21:24,533 - INFO - [16384/19133] Loss : 1.0687 +2024-05-28 11:21:24,946 - INFO - Test loss : 0.4515 +2024-05-28 11:21:24,946 - INFO - +Epoch 25 +2024-05-28 11:21:43,898 - INFO - [16384/19133] Loss : 1.0660 +2024-05-28 11:21:44,303 - INFO - Test loss : 0.4584 +2024-05-28 11:21:44,304 - INFO - +Epoch 26 +2024-05-28 11:22:02,974 - INFO - [16384/19133] Loss : 1.0600 +2024-05-28 11:22:03,397 - INFO - Test loss : 0.4688 +2024-05-28 11:22:03,398 - INFO - +Epoch 27 +2024-05-28 11:22:22,408 - INFO - [16384/19133] Loss : 1.0548 +2024-05-28 11:22:22,847 - INFO - Test loss : 0.4569 +2024-05-28 11:22:22,847 - INFO - +Epoch 28 +2024-05-28 11:22:41,562 - INFO - [16384/19133] Loss : 1.0530 +2024-05-28 11:22:41,903 - INFO - Test loss : 0.4605 +2024-05-28 11:22:41,903 - INFO - +Epoch 29 +2024-05-28 11:23:00,790 - INFO - [16384/19133] Loss : 1.0525 +2024-05-28 11:23:01,216 - INFO - Test loss : 0.4755 +2024-05-28 11:23:01,216 - INFO - Saving output model to ../models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-28 11:23:01,261 - INFO - Saving output model to 
../models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-28 11:23:01,415 - INFO - +No prior +2024-05-28 11:23:01,416 - INFO - (2262,) +2024-05-28 11:23:01,471 - INFO - Split ID: 0 +2024-05-28 11:23:01,471 - INFO - Top 1 acc (%): 70.07 +2024-05-28 11:23:01,471 - INFO - Top 3 acc (%): 86.6 +2024-05-28 11:23:01,471 - INFO - Top 5 acc (%): 90.05 +2024-05-28 11:23:01,471 - INFO - Top 10 acc (%): 92.88 +2024-05-28 11:23:02,971 - INFO - Split ID: 0 +2024-05-28 11:23:02,971 - INFO - Top 1 acc (%): 71.79 +2024-05-28 11:23:02,971 - INFO - Top 3 acc (%): 86.6 +2024-05-28 11:23:02,971 - INFO - Top 5 acc (%): 89.83 +2024-05-28 11:23:02,972 - INFO - Top 10 acc (%): 93.1 +2024-05-28 11:23:02,972 - INFO - +Sphere2Vec-dfs +2024-05-28 11:23:02,972 - INFO - Model : model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-28 11:23:02,994 - INFO - (443,) +2024-05-28 11:23:03,010 - INFO - Split ID: 0 +2024-05-28 11:23:03,011 - INFO - Top 1 LocEnc acc (%): 2.48 +2024-05-28 11:23:03,011 - INFO - Top 3 LocEnc acc (%): 5.87 +2024-05-28 11:23:03,011 - INFO - Top 5 LocEnc acc (%): 8.8 +2024-05-28 11:23:03,011 - INFO - Top 10 LocEnc acc (%): 14.9 +2024-05-31 03:56:51,106 - INFO - +num_classes 500 +2024-05-31 03:56:51,106 - INFO - num train 19133 +2024-05-31 03:56:51,106 - INFO - num val 443 +2024-05-31 03:56:51,106 - INFO - train loss full_loss +2024-05-31 03:56:51,106 - INFO - model name ../models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:56:51,106 - INFO - num users 2872 +2024-05-31 03:56:51,106 - INFO - meta data orig_meta +2024-05-31 03:56:51,967 - INFO - +Only Sphere2Vec-dfs +2024-05-31 03:56:51,967 - INFO - Model : model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:56:52,233 - INFO - Saving output model to ../models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:56:52,482 - INFO - Saving output model to ../models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:56:52,742 - INFO - +No prior +2024-05-31 03:56:52,743 - INFO - (2262,) +2024-05-31 03:56:52,858 - INFO - Save results to ../eval_results/eval_birdsnap_orig_meta_test_no_prior.csv +2024-05-31 03:56:52,858 - INFO - Split ID: 0 +2024-05-31 03:56:52,858 - INFO - Top 1 acc (%): 70.07 +2024-05-31 03:56:52,859 - INFO - Top 3 acc (%): 86.6 +2024-05-31 03:56:52,859 - INFO - Top 5 acc (%): 90.05 +2024-05-31 03:56:52,859 - INFO - Top 10 acc (%): 92.88 +2024-05-31 03:56:53,710 - INFO - Split ID: 0 +2024-05-31 03:56:53,710 - INFO - Top 1 hit (%): 71.79 +2024-05-31 03:56:53,710 - INFO - Top 3 hit (%): 86.6 +2024-05-31 03:56:53,710 - INFO - Top 5 hit (%): 89.83 +2024-05-31 03:56:53,710 - INFO - Top 10 hit (%): 93.1 +2024-05-31 03:56:53,711 - INFO - +Only Sphere2Vec-dfs +2024-05-31 03:56:53,711 - INFO - Model : model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:56:53,765 - INFO - (443,) +2024-05-31 03:56:53,780 - INFO - Split ID: 0 +2024-05-31 03:56:53,781 - INFO - Top 1 LocEnc acc (%): 2.48 +2024-05-31 03:56:53,781 - INFO - Top 3 LocEnc acc (%): 5.87 +2024-05-31 03:56:53,781 - INFO 
- Top 5 LocEnc acc (%): 8.8 +2024-05-31 03:56:53,781 - INFO - Top 10 LocEnc acc (%): 14.9 diff --git a/pre_trained_models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..ff75232e Binary files /dev/null and b/pre_trained_models/sphere2vec_dfs/model_birdsnap_orig_meta_Sphere2Vec-dfs_inception_v3_0.0100_32_0.0200000_1.000_1_512_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.log b/pre_trained_models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.log new file mode 100755 index 00000000..c5d8e70b --- /dev/null +++ b/pre_trained_models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.log @@ -0,0 +1,784 @@ +2024-05-30 09:41:39,097 - INFO - +num_classes 62 +2024-05-30 09:41:39,098 - INFO - num train 363571 +2024-05-30 09:41:39,098 - INFO - num val 53041 +2024-05-30 09:41:39,098 - INFO - train loss full_loss +2024-05-30 09:41:39,098 - INFO - model name ../models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-30 09:41:39,098 - INFO - num users 1 +2024-05-30 09:41:40,060 - INFO - +Epoch 0 +2024-05-30 09:41:41,694 - INFO - [0/363571] Loss : 2.2202 +2024-05-30 09:41:47,519 - INFO - Test loss : 1.1618 +2024-05-30 09:41:47,520 - INFO - +Epoch 1 +2024-05-30 09:41:48,696 - INFO - [0/363571] Loss : 1.9837 +2024-05-30 09:41:54,516 - INFO - Test loss : 1.2002 +2024-05-30 09:41:54,516 - INFO - +Epoch 2 +2024-05-30 09:41:55,587 - INFO - [0/363571] Loss : 2.0065 +2024-05-30 09:42:01,442 - INFO - Test loss : 1.1506 +2024-05-30 09:42:01,443 - INFO - +Epoch 3 +2024-05-30 09:42:02,531 - INFO - [0/363571] Loss : 1.8848 +2024-05-30 09:42:08,380 - INFO - Test loss : 1.0695 +2024-05-30 09:42:08,380 - INFO - +Epoch 4 +2024-05-30 09:42:09,446 - INFO - [0/363571] Loss : 1.8615 +2024-05-30 09:42:15,303 - INFO - Test loss : 0.7848 +2024-05-30 09:42:15,303 - INFO - +Epoch 5 +2024-05-30 09:42:16,379 - INFO - [0/363571] Loss : 1.7937 +2024-05-30 09:42:22,186 - INFO - Test loss : 0.8559 +2024-05-30 09:42:22,186 - INFO - +Epoch 6 +2024-05-30 09:42:23,339 - INFO - [0/363571] Loss : 1.8163 +2024-05-30 09:42:29,213 - INFO - Test loss : 0.8351 +2024-05-30 09:42:29,213 - INFO - +Epoch 7 +2024-05-30 09:42:30,286 - INFO - [0/363571] Loss : 1.7729 +2024-05-30 09:42:36,115 - INFO - Test loss : 0.8415 +2024-05-30 09:42:36,115 - INFO - +Epoch 8 +2024-05-30 09:42:37,275 - INFO - [0/363571] Loss : 1.7584 +2024-05-30 09:42:43,070 - INFO - Test loss : 0.9101 +2024-05-30 09:42:43,070 - INFO - +Epoch 9 +2024-05-30 09:42:44,140 - INFO - [0/363571] Loss : 1.7663 +2024-05-30 09:42:49,935 - INFO - Test loss : 0.8974 +2024-05-30 09:42:49,935 - INFO - +Epoch 10 +2024-05-30 09:42:51,097 - INFO - [0/363571] Loss : 1.7402 +2024-05-30 09:42:56,954 - INFO - Test loss : 0.8509 +2024-05-30 09:43:00,849 - INFO - (53041,) +2024-05-30 09:43:00,983 - INFO - Split ID: 0 +2024-05-30 09:43:01,003 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 6.96 +2024-05-30 09:43:01,005 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 15.86 +2024-05-30 09:43:01,007 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 28.89 +2024-05-30 
09:43:01,008 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 40.33 +2024-05-30 09:43:01,015 - INFO - +No prior +2024-05-30 09:43:01,020 - INFO - (53041,) +2024-05-30 09:43:01,173 - INFO - Split ID: 0 +2024-05-30 09:43:01,173 - INFO - Top 1 (Epoch 10)acc (%): 69.83 +2024-05-30 09:43:01,173 - INFO - Top 3 (Epoch 10)acc (%): 84.61 +2024-05-30 09:43:01,173 - INFO - Top 5 (Epoch 10)acc (%): 89.23 +2024-05-30 09:43:01,173 - INFO - Top 10 (Epoch 10)acc (%): 94.29 +2024-05-30 09:43:24,793 - INFO - Split ID: 0 +2024-05-30 09:43:24,793 - INFO - Top 1 (Epoch 10)acc (%): 70.05 +2024-05-30 09:43:24,793 - INFO - Top 3 (Epoch 10)acc (%): 84.87 +2024-05-30 09:43:24,793 - INFO - Top 5 (Epoch 10)acc (%): 89.43 +2024-05-30 09:43:24,793 - INFO - Top 10 (Epoch 10)acc (%): 94.46 +2024-05-30 09:43:24,797 - INFO - +Epoch 11 +2024-05-30 09:43:25,904 - INFO - [0/363571] Loss : 1.7355 +2024-05-30 09:43:31,758 - INFO - Test loss : 0.8121 +2024-05-30 09:43:31,758 - INFO - +Epoch 12 +2024-05-30 09:43:32,858 - INFO - [0/363571] Loss : 1.7328 +2024-05-30 09:43:38,816 - INFO - Test loss : 0.8098 +2024-05-30 09:43:38,816 - INFO - +Epoch 13 +2024-05-30 09:43:39,904 - INFO - [0/363571] Loss : 1.6964 +2024-05-30 09:43:45,760 - INFO - Test loss : 0.8468 +2024-05-30 09:43:45,760 - INFO - +Epoch 14 +2024-05-30 09:43:46,851 - INFO - [0/363571] Loss : 1.7050 +2024-05-30 09:43:52,708 - INFO - Test loss : 0.8936 +2024-05-30 09:43:52,708 - INFO - +Epoch 15 +2024-05-30 09:43:53,872 - INFO - [0/363571] Loss : 1.6987 +2024-05-30 09:43:59,728 - INFO - Test loss : 0.8845 +2024-05-30 09:43:59,728 - INFO - +Epoch 16 +2024-05-30 09:44:00,811 - INFO - [0/363571] Loss : 1.6838 +2024-05-30 09:44:06,660 - INFO - Test loss : 0.8714 +2024-05-30 09:44:06,660 - INFO - +Epoch 17 +2024-05-30 09:44:07,823 - INFO - [0/363571] Loss : 1.6772 +2024-05-30 09:44:13,671 - INFO - Test loss : 0.8738 +2024-05-30 09:44:13,672 - INFO - +Epoch 18 +2024-05-30 09:44:14,750 - INFO - [0/363571] Loss : 1.6710 +2024-05-30 09:44:20,623 - INFO - Test loss : 0.8650 +2024-05-30 09:44:20,623 - INFO - +Epoch 19 +2024-05-30 09:44:21,708 - INFO - [0/363571] Loss : 1.6677 +2024-05-30 09:44:27,652 - INFO - Test loss : 0.8414 +2024-05-30 09:44:27,652 - INFO - +Epoch 20 +2024-05-30 09:44:28,741 - INFO - [0/363571] Loss : 1.6551 +2024-05-30 09:44:34,610 - INFO - Test loss : 0.8431 +2024-05-30 09:44:38,575 - INFO - (53041,) +2024-05-30 09:44:38,715 - INFO - Split ID: 0 +2024-05-30 09:44:38,736 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 7.96 +2024-05-30 09:44:38,738 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 20.31 +2024-05-30 09:44:38,739 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 26.68 +2024-05-30 09:44:38,741 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 38.97 +2024-05-30 09:44:38,752 - INFO - +No prior +2024-05-30 09:44:38,760 - INFO - (53041,) +2024-05-30 09:44:38,913 - INFO - Split ID: 0 +2024-05-30 09:44:38,913 - INFO - Top 1 (Epoch 20)acc (%): 69.83 +2024-05-30 09:44:38,915 - INFO - Top 3 (Epoch 20)acc (%): 84.61 +2024-05-30 09:44:38,915 - INFO - Top 5 (Epoch 20)acc (%): 89.23 +2024-05-30 09:44:38,915 - INFO - Top 10 (Epoch 20)acc (%): 94.29 +2024-05-30 09:45:02,624 - INFO - Split ID: 0 +2024-05-30 09:45:02,624 - INFO - Top 1 (Epoch 20)acc (%): 70.11 +2024-05-30 09:45:02,624 - INFO - Top 3 (Epoch 20)acc (%): 84.82 +2024-05-30 09:45:02,624 - INFO - Top 5 (Epoch 20)acc (%): 89.48 +2024-05-30 09:45:02,624 - INFO - Top 10 (Epoch 20)acc (%): 94.44 +2024-05-30 09:45:02,626 - INFO - +Epoch 21 +2024-05-30 09:45:03,708 - INFO - [0/363571] Loss : 1.6515 +2024-05-30 09:45:09,576 - INFO - Test loss : 
0.8552 +2024-05-30 09:45:09,576 - INFO - +Epoch 22 +2024-05-30 09:45:10,740 - INFO - [0/363571] Loss : 1.6543 +2024-05-30 09:45:16,584 - INFO - Test loss : 0.8654 +2024-05-30 09:45:16,584 - INFO - +Epoch 23 +2024-05-30 09:45:17,673 - INFO - [0/363571] Loss : 1.6428 +2024-05-30 09:45:23,509 - INFO - Test loss : 0.8533 +2024-05-30 09:45:23,509 - INFO - +Epoch 24 +2024-05-30 09:45:24,676 - INFO - [0/363571] Loss : 1.6414 +2024-05-30 09:45:30,448 - INFO - Test loss : 0.8360 +2024-05-30 09:45:30,448 - INFO - +Epoch 25 +2024-05-30 09:45:31,515 - INFO - [0/363571] Loss : 1.6421 +2024-05-30 09:45:37,283 - INFO - Test loss : 0.8263 +2024-05-30 09:45:37,283 - INFO - +Epoch 26 +2024-05-30 09:45:38,435 - INFO - [0/363571] Loss : 1.6307 +2024-05-30 09:45:44,212 - INFO - Test loss : 0.8310 +2024-05-30 09:45:44,212 - INFO - +Epoch 27 +2024-05-30 09:45:45,278 - INFO - [0/363571] Loss : 1.6264 +2024-05-30 09:45:51,045 - INFO - Test loss : 0.8475 +2024-05-30 09:45:51,045 - INFO - +Epoch 28 +2024-05-30 09:45:52,113 - INFO - [0/363571] Loss : 1.6296 +2024-05-30 09:45:57,972 - INFO - Test loss : 0.8536 +2024-05-30 09:45:57,972 - INFO - +Epoch 29 +2024-05-30 09:45:59,044 - INFO - [0/363571] Loss : 1.6304 +2024-05-30 09:46:04,826 - INFO - Test loss : 0.8350 +2024-05-30 09:46:04,826 - INFO - +Epoch 30 +2024-05-30 09:46:05,893 - INFO - [0/363571] Loss : 1.6129 +2024-05-30 09:46:11,672 - INFO - Test loss : 0.8234 +2024-05-30 09:46:15,541 - INFO - (53041,) +2024-05-30 09:46:15,678 - INFO - Split ID: 0 +2024-05-30 09:46:15,699 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 7.19 +2024-05-30 09:46:15,701 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 18.18 +2024-05-30 09:46:15,702 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 26.87 +2024-05-30 09:46:15,704 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 39.38 +2024-05-30 09:46:15,715 - INFO - +No prior +2024-05-30 09:46:15,724 - INFO - (53041,) +2024-05-30 09:46:15,875 - INFO - Split ID: 0 +2024-05-30 09:46:15,875 - INFO - Top 1 (Epoch 30)acc (%): 69.83 +2024-05-30 09:46:15,876 - INFO - Top 3 (Epoch 30)acc (%): 84.61 +2024-05-30 09:46:15,876 - INFO - Top 5 (Epoch 30)acc (%): 89.23 +2024-05-30 09:46:15,876 - INFO - Top 10 (Epoch 30)acc (%): 94.29 +2024-05-30 09:46:39,720 - INFO - Split ID: 0 +2024-05-30 09:46:39,721 - INFO - Top 1 (Epoch 30)acc (%): 70.16 +2024-05-30 09:46:39,721 - INFO - Top 3 (Epoch 30)acc (%): 84.89 +2024-05-30 09:46:39,721 - INFO - Top 5 (Epoch 30)acc (%): 89.5 +2024-05-30 09:46:39,721 - INFO - Top 10 (Epoch 30)acc (%): 94.46 +2024-05-30 09:46:39,723 - INFO - +Epoch 31 +2024-05-30 09:46:40,874 - INFO - [0/363571] Loss : 1.6245 +2024-05-30 09:46:46,654 - INFO - Test loss : 0.8237 +2024-05-30 09:46:46,654 - INFO - +Epoch 32 +2024-05-30 09:46:47,733 - INFO - [0/363571] Loss : 1.6182 +2024-05-30 09:46:53,490 - INFO - Test loss : 0.8276 +2024-05-30 09:46:53,490 - INFO - +Epoch 33 +2024-05-30 09:46:54,636 - INFO - [0/363571] Loss : 1.6062 +2024-05-30 09:47:00,494 - INFO - Test loss : 0.8319 +2024-05-30 09:47:00,494 - INFO - +Epoch 34 +2024-05-30 09:47:01,581 - INFO - [0/363571] Loss : 1.6021 +2024-05-30 09:47:07,451 - INFO - Test loss : 0.8303 +2024-05-30 09:47:07,451 - INFO - +Epoch 35 +2024-05-30 09:47:08,532 - INFO - [0/363571] Loss : 1.5951 +2024-05-30 09:47:14,442 - INFO - Test loss : 0.8302 +2024-05-30 09:47:14,443 - INFO - +Epoch 36 +2024-05-30 09:47:15,522 - INFO - [0/363571] Loss : 1.6058 +2024-05-30 09:47:21,334 - INFO - Test loss : 0.8303 +2024-05-30 09:47:21,334 - INFO - +Epoch 37 +2024-05-30 09:47:22,408 - INFO - [0/363571] Loss : 1.5959 +2024-05-30 09:47:28,211 - 
INFO - Test loss : 0.8325 +2024-05-30 09:47:28,211 - INFO - +Epoch 38 +2024-05-30 09:47:29,352 - INFO - [0/363571] Loss : 1.6044 +2024-05-30 09:47:35,157 - INFO - Test loss : 0.8315 +2024-05-30 09:47:35,157 - INFO - +Epoch 39 +2024-05-30 09:47:36,231 - INFO - [0/363571] Loss : 1.5936 +2024-05-30 09:47:42,023 - INFO - Test loss : 0.8325 +2024-05-30 09:47:42,023 - INFO - +Epoch 40 +2024-05-30 09:47:43,181 - INFO - [0/363571] Loss : 1.5941 +2024-05-30 09:47:48,984 - INFO - Test loss : 0.8252 +2024-05-30 09:47:52,865 - INFO - (53041,) +2024-05-30 09:47:53,004 - INFO - Split ID: 0 +2024-05-30 09:47:53,026 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 8.5 +2024-05-30 09:47:53,028 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 19.64 +2024-05-30 09:47:53,029 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 28.71 +2024-05-30 09:47:53,031 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 41.76 +2024-05-30 09:47:53,042 - INFO - +No prior +2024-05-30 09:47:53,051 - INFO - (53041,) +2024-05-30 09:47:53,201 - INFO - Split ID: 0 +2024-05-30 09:47:53,202 - INFO - Top 1 (Epoch 40)acc (%): 69.83 +2024-05-30 09:47:53,202 - INFO - Top 3 (Epoch 40)acc (%): 84.61 +2024-05-30 09:47:53,202 - INFO - Top 5 (Epoch 40)acc (%): 89.23 +2024-05-30 09:47:53,202 - INFO - Top 10 (Epoch 40)acc (%): 94.29 +2024-05-30 09:48:16,804 - INFO - Split ID: 0 +2024-05-30 09:48:16,804 - INFO - Top 1 (Epoch 40)acc (%): 70.17 +2024-05-30 09:48:16,804 - INFO - Top 3 (Epoch 40)acc (%): 84.89 +2024-05-30 09:48:16,804 - INFO - Top 5 (Epoch 40)acc (%): 89.51 +2024-05-30 09:48:16,804 - INFO - Top 10 (Epoch 40)acc (%): 94.47 +2024-05-30 09:48:16,806 - INFO - +Epoch 41 +2024-05-30 09:48:17,874 - INFO - [0/363571] Loss : 1.5970 +2024-05-30 09:48:23,654 - INFO - Test loss : 0.8200 +2024-05-30 09:48:23,654 - INFO - +Epoch 42 +2024-05-30 09:48:24,801 - INFO - [0/363571] Loss : 1.5920 +2024-05-30 09:48:30,585 - INFO - Test loss : 0.8158 +2024-05-30 09:48:30,585 - INFO - +Epoch 43 +2024-05-30 09:48:31,659 - INFO - [0/363571] Loss : 1.5918 +2024-05-30 09:48:37,482 - INFO - Test loss : 0.8128 +2024-05-30 09:48:37,482 - INFO - +Epoch 44 +2024-05-30 09:48:38,561 - INFO - [0/363571] Loss : 1.5856 +2024-05-30 09:48:44,469 - INFO - Test loss : 0.8152 +2024-05-30 09:48:44,469 - INFO - +Epoch 45 +2024-05-30 09:48:45,532 - INFO - [0/363571] Loss : 1.5809 +2024-05-30 09:48:51,338 - INFO - Test loss : 0.8215 +2024-05-30 09:48:51,338 - INFO - +Epoch 46 +2024-05-30 09:48:52,403 - INFO - [0/363571] Loss : 1.5737 +2024-05-30 09:48:58,192 - INFO - Test loss : 0.8349 +2024-05-30 09:48:58,192 - INFO - +Epoch 47 +2024-05-30 09:48:59,339 - INFO - [0/363571] Loss : 1.5746 +2024-05-30 09:49:05,107 - INFO - Test loss : 0.8376 +2024-05-30 09:49:05,107 - INFO - +Epoch 48 +2024-05-30 09:49:06,178 - INFO - [0/363571] Loss : 1.5804 +2024-05-30 09:49:11,968 - INFO - Test loss : 0.8256 +2024-05-30 09:49:11,968 - INFO - +Epoch 49 +2024-05-30 09:49:13,117 - INFO - [0/363571] Loss : 1.5787 +2024-05-30 09:49:18,940 - INFO - Test loss : 0.8216 +2024-05-30 09:49:18,940 - INFO - +Epoch 50 +2024-05-30 09:49:20,003 - INFO - [0/363571] Loss : 1.5780 +2024-05-30 09:49:25,786 - INFO - Test loss : 0.8224 +2024-05-30 09:49:29,647 - INFO - (53041,) +2024-05-30 09:49:29,787 - INFO - Split ID: 0 +2024-05-30 09:49:29,805 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 7.15 +2024-05-30 09:49:29,807 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 21.01 +2024-05-30 09:49:29,809 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 28.01 +2024-05-30 09:49:29,811 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 41.71 +2024-05-30 09:49:29,821 - INFO - +No prior 
+2024-05-30 09:49:29,830 - INFO - (53041,) +2024-05-30 09:49:29,983 - INFO - Split ID: 0 +2024-05-30 09:49:29,983 - INFO - Top 1 (Epoch 50)acc (%): 69.83 +2024-05-30 09:49:29,983 - INFO - Top 3 (Epoch 50)acc (%): 84.61 +2024-05-30 09:49:29,983 - INFO - Top 5 (Epoch 50)acc (%): 89.23 +2024-05-30 09:49:29,983 - INFO - Top 10 (Epoch 50)acc (%): 94.29 +2024-05-30 09:49:53,740 - INFO - Split ID: 0 +2024-05-30 09:49:53,740 - INFO - Top 1 (Epoch 50)acc (%): 70.19 +2024-05-30 09:49:53,740 - INFO - Top 3 (Epoch 50)acc (%): 84.93 +2024-05-30 09:49:53,740 - INFO - Top 5 (Epoch 50)acc (%): 89.53 +2024-05-30 09:49:53,741 - INFO - Top 10 (Epoch 50)acc (%): 94.44 +2024-05-30 09:49:53,743 - INFO - +Epoch 51 +2024-05-30 09:49:54,819 - INFO - [0/363571] Loss : 1.5826 +2024-05-30 09:50:00,752 - INFO - Test loss : 0.8286 +2024-05-30 09:50:00,752 - INFO - +Epoch 52 +2024-05-30 09:50:01,847 - INFO - [0/363571] Loss : 1.5730 +2024-05-30 09:50:07,679 - INFO - Test loss : 0.8361 +2024-05-30 09:50:07,680 - INFO - +Epoch 53 +2024-05-30 09:50:08,750 - INFO - [0/363571] Loss : 1.5774 +2024-05-30 09:50:14,575 - INFO - Test loss : 0.8360 +2024-05-30 09:50:14,576 - INFO - +Epoch 54 +2024-05-30 09:50:15,725 - INFO - [0/363571] Loss : 1.5679 +2024-05-30 09:50:21,539 - INFO - Test loss : 0.8367 +2024-05-30 09:50:21,539 - INFO - +Epoch 55 +2024-05-30 09:50:22,612 - INFO - [0/363571] Loss : 1.5625 +2024-05-30 09:50:28,481 - INFO - Test loss : 0.8302 +2024-05-30 09:50:28,481 - INFO - +Epoch 56 +2024-05-30 09:50:29,653 - INFO - [0/363571] Loss : 1.5620 +2024-05-30 09:50:35,529 - INFO - Test loss : 0.8263 +2024-05-30 09:50:35,529 - INFO - +Epoch 57 +2024-05-30 09:50:36,621 - INFO - [0/363571] Loss : 1.5591 +2024-05-30 09:50:42,465 - INFO - Test loss : 0.8225 +2024-05-30 09:50:42,465 - INFO - +Epoch 58 +2024-05-30 09:50:43,623 - INFO - [0/363571] Loss : 1.5706 +2024-05-30 09:50:49,479 - INFO - Test loss : 0.8277 +2024-05-30 09:50:49,479 - INFO - +Epoch 59 +2024-05-30 09:50:50,563 - INFO - [0/363571] Loss : 1.5583 +2024-05-30 09:50:56,440 - INFO - Test loss : 0.8321 +2024-05-30 09:50:56,440 - INFO - +Epoch 60 +2024-05-30 09:50:57,519 - INFO - [0/363571] Loss : 1.5624 +2024-05-30 09:51:03,391 - INFO - Test loss : 0.8337 +2024-05-30 09:51:07,255 - INFO - (53041,) +2024-05-30 09:51:07,394 - INFO - Split ID: 0 +2024-05-30 09:51:07,412 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 8.54 +2024-05-30 09:51:07,414 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 21.17 +2024-05-30 09:51:07,416 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 30.48 +2024-05-30 09:51:07,417 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 42.72 +2024-05-30 09:51:07,427 - INFO - +No prior +2024-05-30 09:51:07,436 - INFO - (53041,) +2024-05-30 09:51:07,589 - INFO - Split ID: 0 +2024-05-30 09:51:07,589 - INFO - Top 1 (Epoch 60)acc (%): 69.83 +2024-05-30 09:51:07,589 - INFO - Top 3 (Epoch 60)acc (%): 84.61 +2024-05-30 09:51:07,589 - INFO - Top 5 (Epoch 60)acc (%): 89.23 +2024-05-30 09:51:07,589 - INFO - Top 10 (Epoch 60)acc (%): 94.29 +2024-05-30 09:51:31,196 - INFO - Split ID: 0 +2024-05-30 09:51:31,196 - INFO - Top 1 (Epoch 60)acc (%): 70.24 +2024-05-30 09:51:31,196 - INFO - Top 3 (Epoch 60)acc (%): 84.93 +2024-05-30 09:51:31,196 - INFO - Top 5 (Epoch 60)acc (%): 89.54 +2024-05-30 09:51:31,196 - INFO - Top 10 (Epoch 60)acc (%): 94.49 +2024-05-30 09:51:31,199 - INFO - +Epoch 61 +2024-05-30 09:51:32,264 - INFO - [0/363571] Loss : 1.5562 +2024-05-30 09:51:38,048 - INFO - Test loss : 0.8366 +2024-05-30 09:51:38,048 - INFO - +Epoch 62 +2024-05-30 09:51:39,114 - INFO - [0/363571] Loss : 1.5637 
+2024-05-30 09:51:44,903 - INFO - Test loss : 0.8335 +2024-05-30 09:51:44,903 - INFO - +Epoch 63 +2024-05-30 09:51:46,058 - INFO - [0/363571] Loss : 1.5633 +2024-05-30 09:51:51,879 - INFO - Test loss : 0.8257 +2024-05-30 09:51:51,879 - INFO - +Epoch 64 +2024-05-30 09:51:52,964 - INFO - [0/363571] Loss : 1.5574 +2024-05-30 09:51:58,809 - INFO - Test loss : 0.8151 +2024-05-30 09:51:58,809 - INFO - +Epoch 65 +2024-05-30 09:51:59,972 - INFO - [0/363571] Loss : 1.5631 +2024-05-30 09:52:05,819 - INFO - Test loss : 0.8134 +2024-05-30 09:52:05,820 - INFO - +Epoch 66 +2024-05-30 09:52:06,909 - INFO - [0/363571] Loss : 1.5431 +2024-05-30 09:52:12,727 - INFO - Test loss : 0.8196 +2024-05-30 09:52:12,727 - INFO - +Epoch 67 +2024-05-30 09:52:13,820 - INFO - [0/363571] Loss : 1.5544 +2024-05-30 09:52:19,763 - INFO - Test loss : 0.8272 +2024-05-30 09:52:19,763 - INFO - +Epoch 68 +2024-05-30 09:52:20,839 - INFO - [0/363571] Loss : 1.5485 +2024-05-30 09:52:26,671 - INFO - Test loss : 0.8279 +2024-05-30 09:52:26,671 - INFO - +Epoch 69 +2024-05-30 09:52:27,760 - INFO - [0/363571] Loss : 1.5541 +2024-05-30 09:52:33,624 - INFO - Test loss : 0.8245 +2024-05-30 09:52:33,625 - INFO - +Epoch 70 +2024-05-30 09:52:34,776 - INFO - [0/363571] Loss : 1.5490 +2024-05-30 09:52:40,614 - INFO - Test loss : 0.8230 +2024-05-30 09:52:44,520 - INFO - (53041,) +2024-05-30 09:52:44,662 - INFO - Split ID: 0 +2024-05-30 09:52:44,681 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 7.18 +2024-05-30 09:52:44,683 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 22.18 +2024-05-30 09:52:44,684 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 30.2 +2024-05-30 09:52:44,686 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 43.99 +2024-05-30 09:52:44,697 - INFO - +No prior +2024-05-30 09:52:44,706 - INFO - (53041,) +2024-05-30 09:52:44,856 - INFO - Split ID: 0 +2024-05-30 09:52:44,856 - INFO - Top 1 (Epoch 70)acc (%): 69.83 +2024-05-30 09:52:44,857 - INFO - Top 3 (Epoch 70)acc (%): 84.61 +2024-05-30 09:52:44,857 - INFO - Top 5 (Epoch 70)acc (%): 89.23 +2024-05-30 09:52:44,857 - INFO - Top 10 (Epoch 70)acc (%): 94.29 +2024-05-30 09:53:08,692 - INFO - Split ID: 0 +2024-05-30 09:53:08,692 - INFO - Top 1 (Epoch 70)acc (%): 70.22 +2024-05-30 09:53:08,692 - INFO - Top 3 (Epoch 70)acc (%): 84.96 +2024-05-30 09:53:08,692 - INFO - Top 5 (Epoch 70)acc (%): 89.53 +2024-05-30 09:53:08,692 - INFO - Top 10 (Epoch 70)acc (%): 94.49 +2024-05-30 09:53:08,695 - INFO - +Epoch 71 +2024-05-30 09:53:09,773 - INFO - [0/363571] Loss : 1.5516 +2024-05-30 09:53:15,636 - INFO - Test loss : 0.8250 +2024-05-30 09:53:15,637 - INFO - +Epoch 72 +2024-05-30 09:53:16,810 - INFO - [0/363571] Loss : 1.5443 +2024-05-30 09:53:22,657 - INFO - Test loss : 0.8352 +2024-05-30 09:53:22,657 - INFO - +Epoch 73 +2024-05-30 09:53:23,745 - INFO - [0/363571] Loss : 1.5593 +2024-05-30 09:53:29,584 - INFO - Test loss : 0.8384 +2024-05-30 09:53:29,584 - INFO - +Epoch 74 +2024-05-30 09:53:30,738 - INFO - [0/363571] Loss : 1.5482 +2024-05-30 09:53:36,542 - INFO - Test loss : 0.8372 +2024-05-30 09:53:36,542 - INFO - +Epoch 75 +2024-05-30 09:53:37,607 - INFO - [0/363571] Loss : 1.5475 +2024-05-30 09:53:43,391 - INFO - Test loss : 0.8319 +2024-05-30 09:53:43,391 - INFO - +Epoch 76 +2024-05-30 09:53:44,458 - INFO - [0/363571] Loss : 1.5420 +2024-05-30 09:53:50,330 - INFO - Test loss : 0.8262 +2024-05-30 09:53:50,330 - INFO - +Epoch 77 +2024-05-30 09:53:51,427 - INFO - [0/363571] Loss : 1.5541 +2024-05-30 09:53:57,315 - INFO - Test loss : 0.8255 +2024-05-30 09:53:57,315 - INFO - +Epoch 78 +2024-05-30 09:53:58,398 - INFO - 
[0/363571] Loss : 1.5360 +2024-05-30 09:54:04,259 - INFO - Test loss : 0.8275 +2024-05-30 09:54:04,259 - INFO - +Epoch 79 +2024-05-30 09:54:05,420 - INFO - [0/363571] Loss : 1.5409 +2024-05-30 09:54:11,249 - INFO - Test loss : 0.8302 +2024-05-30 09:54:11,249 - INFO - +Epoch 80 +2024-05-30 09:54:12,338 - INFO - [0/363571] Loss : 1.5321 +2024-05-30 09:54:18,194 - INFO - Test loss : 0.8318 +2024-05-30 09:54:22,133 - INFO - (53041,) +2024-05-30 09:54:22,273 - INFO - Split ID: 0 +2024-05-30 09:54:22,292 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 11.03 +2024-05-30 09:54:22,294 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 23.83 +2024-05-30 09:54:22,295 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 31.16 +2024-05-30 09:54:22,297 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 44.4 +2024-05-30 09:54:22,307 - INFO - +No prior +2024-05-30 09:54:22,316 - INFO - (53041,) +2024-05-30 09:54:22,468 - INFO - Split ID: 0 +2024-05-30 09:54:22,468 - INFO - Top 1 (Epoch 80)acc (%): 69.83 +2024-05-30 09:54:22,469 - INFO - Top 3 (Epoch 80)acc (%): 84.61 +2024-05-30 09:54:22,469 - INFO - Top 5 (Epoch 80)acc (%): 89.23 +2024-05-30 09:54:22,469 - INFO - Top 10 (Epoch 80)acc (%): 94.29 +2024-05-30 09:54:46,238 - INFO - Split ID: 0 +2024-05-30 09:54:46,239 - INFO - Top 1 (Epoch 80)acc (%): 70.22 +2024-05-30 09:54:46,239 - INFO - Top 3 (Epoch 80)acc (%): 84.97 +2024-05-30 09:54:46,239 - INFO - Top 5 (Epoch 80)acc (%): 89.53 +2024-05-30 09:54:46,239 - INFO - Top 10 (Epoch 80)acc (%): 94.48 +2024-05-30 09:54:46,241 - INFO - +Epoch 81 +2024-05-30 09:54:47,421 - INFO - [0/363571] Loss : 1.5333 +2024-05-30 09:54:53,280 - INFO - Test loss : 0.8347 +2024-05-30 09:54:53,280 - INFO - +Epoch 82 +2024-05-30 09:54:54,363 - INFO - [0/363571] Loss : 1.5311 +2024-05-30 09:55:00,235 - INFO - Test loss : 0.8355 +2024-05-30 09:55:00,235 - INFO - +Epoch 83 +2024-05-30 09:55:01,311 - INFO - [0/363571] Loss : 1.5381 +2024-05-30 09:55:07,236 - INFO - Test loss : 0.8352 +2024-05-30 09:55:07,236 - INFO - +Epoch 84 +2024-05-30 09:55:08,330 - INFO - [0/363571] Loss : 1.5465 +2024-05-30 09:55:14,176 - INFO - Test loss : 0.8309 +2024-05-30 09:55:14,176 - INFO - +Epoch 85 +2024-05-30 09:55:15,259 - INFO - [0/363571] Loss : 1.5347 +2024-05-30 09:55:21,115 - INFO - Test loss : 0.8279 +2024-05-30 09:55:21,115 - INFO - +Epoch 86 +2024-05-30 09:55:22,273 - INFO - [0/363571] Loss : 1.5338 +2024-05-30 09:55:28,100 - INFO - Test loss : 0.8322 +2024-05-30 09:55:28,100 - INFO - +Epoch 87 +2024-05-30 09:55:29,196 - INFO - [0/363571] Loss : 1.5319 +2024-05-30 09:55:35,043 - INFO - Test loss : 0.8404 +2024-05-30 09:55:35,043 - INFO - +Epoch 88 +2024-05-30 09:55:36,197 - INFO - [0/363571] Loss : 1.5300 +2024-05-30 09:55:42,048 - INFO - Test loss : 0.8455 +2024-05-30 09:55:42,048 - INFO - +Epoch 89 +2024-05-30 09:55:43,136 - INFO - [0/363571] Loss : 1.5488 +2024-05-30 09:55:48,973 - INFO - Test loss : 0.8417 +2024-05-30 09:55:48,973 - INFO - +Epoch 90 +2024-05-30 09:55:50,127 - INFO - [0/363571] Loss : 1.5220 +2024-05-30 09:55:55,911 - INFO - Test loss : 0.8349 +2024-05-30 09:55:59,858 - INFO - (53041,) +2024-05-30 09:55:59,998 - INFO - Split ID: 0 +2024-05-30 09:56:00,017 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 8.2 +2024-05-30 09:56:00,019 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 22.72 +2024-05-30 09:56:00,020 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 31.12 +2024-05-30 09:56:00,022 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 43.65 +2024-05-30 09:56:00,032 - INFO - +No prior +2024-05-30 09:56:00,042 - INFO - (53041,) +2024-05-30 09:56:00,192 - INFO - Split ID: 0 +2024-05-30 
09:56:00,193 - INFO - Top 1 (Epoch 90)acc (%): 69.83 +2024-05-30 09:56:00,193 - INFO - Top 3 (Epoch 90)acc (%): 84.61 +2024-05-30 09:56:00,193 - INFO - Top 5 (Epoch 90)acc (%): 89.23 +2024-05-30 09:56:00,193 - INFO - Top 10 (Epoch 90)acc (%): 94.29 +2024-05-30 09:56:23,869 - INFO - Split ID: 0 +2024-05-30 09:56:23,869 - INFO - Top 1 (Epoch 90)acc (%): 70.27 +2024-05-30 09:56:23,869 - INFO - Top 3 (Epoch 90)acc (%): 84.96 +2024-05-30 09:56:23,870 - INFO - Top 5 (Epoch 90)acc (%): 89.56 +2024-05-30 09:56:23,870 - INFO - Top 10 (Epoch 90)acc (%): 94.47 +2024-05-30 09:56:23,872 - INFO - +Epoch 91 +2024-05-30 09:56:24,961 - INFO - [0/363571] Loss : 1.5432 +2024-05-30 09:56:30,855 - INFO - Test loss : 0.8271 +2024-05-30 09:56:30,855 - INFO - +Epoch 92 +2024-05-30 09:56:31,916 - INFO - [0/363571] Loss : 1.5330 +2024-05-30 09:56:37,804 - INFO - Test loss : 0.8237 +2024-05-30 09:56:37,804 - INFO - +Epoch 93 +2024-05-30 09:56:38,871 - INFO - [0/363571] Loss : 1.5372 +2024-05-30 09:56:44,698 - INFO - Test loss : 0.8199 +2024-05-30 09:56:44,698 - INFO - +Epoch 94 +2024-05-30 09:56:45,783 - INFO - [0/363571] Loss : 1.5228 +2024-05-30 09:56:51,597 - INFO - Test loss : 0.8211 +2024-05-30 09:56:51,598 - INFO - +Epoch 95 +2024-05-30 09:56:52,743 - INFO - [0/363571] Loss : 1.5375 +2024-05-30 09:56:58,530 - INFO - Test loss : 0.8223 +2024-05-30 09:56:58,530 - INFO - +Epoch 96 +2024-05-30 09:56:59,599 - INFO - [0/363571] Loss : 1.5249 +2024-05-30 09:57:05,403 - INFO - Test loss : 0.8269 +2024-05-30 09:57:05,403 - INFO - +Epoch 97 +2024-05-30 09:57:06,549 - INFO - [0/363571] Loss : 1.5316 +2024-05-30 09:57:12,346 - INFO - Test loss : 0.8309 +2024-05-30 09:57:12,346 - INFO - +Epoch 98 +2024-05-30 09:57:13,411 - INFO - [0/363571] Loss : 1.5227 +2024-05-30 09:57:19,211 - INFO - Test loss : 0.8339 +2024-05-30 09:57:19,211 - INFO - +Epoch 99 +2024-05-30 09:57:20,276 - INFO - [0/363571] Loss : 1.5284 +2024-05-30 09:57:26,161 - INFO - Test loss : 0.8333 +2024-05-30 09:57:26,161 - INFO - +Epoch 100 +2024-05-30 09:57:27,231 - INFO - [0/363571] Loss : 1.5365 +2024-05-30 09:57:33,109 - INFO - Test loss : 0.8300 +2024-05-30 09:57:37,074 - INFO - (53041,) +2024-05-30 09:57:37,217 - INFO - Split ID: 0 +2024-05-30 09:57:37,238 - INFO - Top 1 LocEnc (Epoch 100)acc (%): 9.58 +2024-05-30 09:57:37,240 - INFO - Top 3 LocEnc (Epoch 100)acc (%): 22.83 +2024-05-30 09:57:37,242 - INFO - Top 5 LocEnc (Epoch 100)acc (%): 30.29 +2024-05-30 09:57:37,243 - INFO - Top 10 LocEnc (Epoch 100)acc (%): 44.48 +2024-05-30 09:57:37,255 - INFO - +No prior +2024-05-30 09:57:37,265 - INFO - (53041,) +2024-05-30 09:57:37,422 - INFO - Split ID: 0 +2024-05-30 09:57:37,422 - INFO - Top 1 (Epoch 100)acc (%): 69.83 +2024-05-30 09:57:37,423 - INFO - Top 3 (Epoch 100)acc (%): 84.61 +2024-05-30 09:57:37,423 - INFO - Top 5 (Epoch 100)acc (%): 89.23 +2024-05-30 09:57:37,423 - INFO - Top 10 (Epoch 100)acc (%): 94.29 +2024-05-30 09:58:01,056 - INFO - Split ID: 0 +2024-05-30 09:58:01,056 - INFO - Top 1 (Epoch 100)acc (%): 70.26 +2024-05-30 09:58:01,057 - INFO - Top 3 (Epoch 100)acc (%): 84.98 +2024-05-30 09:58:01,057 - INFO - Top 5 (Epoch 100)acc (%): 89.55 +2024-05-30 09:58:01,057 - INFO - Top 10 (Epoch 100)acc (%): 94.48 +2024-05-30 09:58:01,059 - INFO - +Epoch 101 +2024-05-30 09:58:02,148 - INFO - [0/363571] Loss : 1.5253 +2024-05-30 09:58:08,036 - INFO - Test loss : 0.8272 +2024-05-30 09:58:08,037 - INFO - +Epoch 102 +2024-05-30 09:58:09,211 - INFO - [0/363571] Loss : 1.5231 +2024-05-30 09:58:15,071 - INFO - Test loss : 0.8264 +2024-05-30 09:58:15,071 - INFO - 
+Epoch 103 +2024-05-30 09:58:16,167 - INFO - [0/363571] Loss : 1.5242 +2024-05-30 09:58:22,061 - INFO - Test loss : 0.8300 +2024-05-30 09:58:22,061 - INFO - +Epoch 104 +2024-05-30 09:58:23,235 - INFO - [0/363571] Loss : 1.5228 +2024-05-30 09:58:29,128 - INFO - Test loss : 0.8352 +2024-05-30 09:58:29,128 - INFO - +Epoch 105 +2024-05-30 09:58:30,213 - INFO - [0/363571] Loss : 1.5380 +2024-05-30 09:58:36,070 - INFO - Test loss : 0.8353 +2024-05-30 09:58:36,070 - INFO - +Epoch 106 +2024-05-30 09:58:37,240 - INFO - [0/363571] Loss : 1.5274 +2024-05-30 09:58:43,129 - INFO - Test loss : 0.8370 +2024-05-30 09:58:43,129 - INFO - +Epoch 107 +2024-05-30 09:58:44,223 - INFO - [0/363571] Loss : 1.5314 +2024-05-30 09:58:50,108 - INFO - Test loss : 0.8341 +2024-05-30 09:58:50,108 - INFO - +Epoch 108 +2024-05-30 09:58:51,201 - INFO - [0/363571] Loss : 1.5391 +2024-05-30 09:58:57,177 - INFO - Test loss : 0.8290 +2024-05-30 09:58:57,177 - INFO - +Epoch 109 +2024-05-30 09:58:58,265 - INFO - [0/363571] Loss : 1.5225 +2024-05-30 09:59:04,147 - INFO - Test loss : 0.8257 +2024-05-30 09:59:04,147 - INFO - +Epoch 110 +2024-05-30 09:59:05,233 - INFO - [0/363571] Loss : 1.5140 +2024-05-30 09:59:11,108 - INFO - Test loss : 0.8245 +2024-05-30 09:59:15,016 - INFO - (53041,) +2024-05-30 09:59:15,156 - INFO - Split ID: 0 +2024-05-30 09:59:15,174 - INFO - Top 1 LocEnc (Epoch 110)acc (%): 9.68 +2024-05-30 09:59:15,176 - INFO - Top 3 LocEnc (Epoch 110)acc (%): 24.23 +2024-05-30 09:59:15,178 - INFO - Top 5 LocEnc (Epoch 110)acc (%): 31.75 +2024-05-30 09:59:15,180 - INFO - Top 10 LocEnc (Epoch 110)acc (%): 45.89 +2024-05-30 09:59:15,191 - INFO - +No prior +2024-05-30 09:59:15,200 - INFO - (53041,) +2024-05-30 09:59:15,353 - INFO - Split ID: 0 +2024-05-30 09:59:15,353 - INFO - Top 1 (Epoch 110)acc (%): 69.83 +2024-05-30 09:59:15,353 - INFO - Top 3 (Epoch 110)acc (%): 84.61 +2024-05-30 09:59:15,353 - INFO - Top 5 (Epoch 110)acc (%): 89.23 +2024-05-30 09:59:15,353 - INFO - Top 10 (Epoch 110)acc (%): 94.29 +2024-05-30 09:59:39,122 - INFO - Split ID: 0 +2024-05-30 09:59:39,122 - INFO - Top 1 (Epoch 110)acc (%): 70.29 +2024-05-30 09:59:39,123 - INFO - Top 3 (Epoch 110)acc (%): 84.98 +2024-05-30 09:59:39,123 - INFO - Top 5 (Epoch 110)acc (%): 89.58 +2024-05-30 09:59:39,123 - INFO - Top 10 (Epoch 110)acc (%): 94.49 +2024-05-30 09:59:39,125 - INFO - +Epoch 111 +2024-05-30 09:59:40,312 - INFO - [0/363571] Loss : 1.5238 +2024-05-30 09:59:46,188 - INFO - Test loss : 0.8249 +2024-05-30 09:59:46,188 - INFO - +Epoch 112 +2024-05-30 09:59:47,286 - INFO - [0/363571] Loss : 1.5132 +2024-05-30 09:59:53,183 - INFO - Test loss : 0.8293 +2024-05-30 09:59:53,183 - INFO - +Epoch 113 +2024-05-30 09:59:54,347 - INFO - [0/363571] Loss : 1.5351 +2024-05-30 10:00:00,188 - INFO - Test loss : 0.8342 +2024-05-30 10:00:00,188 - INFO - +Epoch 114 +2024-05-30 10:00:01,304 - INFO - [0/363571] Loss : 1.5207 +2024-05-30 10:00:07,145 - INFO - Test loss : 0.8382 +2024-05-30 10:00:07,145 - INFO - +Epoch 115 +2024-05-30 10:00:08,220 - INFO - [0/363571] Loss : 1.5226 +2024-05-30 10:00:14,142 - INFO - Test loss : 0.8407 +2024-05-30 10:00:14,143 - INFO - +Epoch 116 +2024-05-30 10:00:15,224 - INFO - [0/363571] Loss : 1.5260 +2024-05-30 10:00:21,080 - INFO - Test loss : 0.8395 +2024-05-30 10:00:21,080 - INFO - +Epoch 117 +2024-05-30 10:00:22,141 - INFO - [0/363571] Loss : 1.5179 +2024-05-30 10:00:27,938 - INFO - Test loss : 0.8359 +2024-05-30 10:00:27,938 - INFO - +Epoch 118 +2024-05-30 10:00:29,091 - INFO - [0/363571] Loss : 1.5246 +2024-05-30 10:00:34,854 - INFO - Test 
loss : 0.8317 +2024-05-30 10:00:34,854 - INFO - +Epoch 119 +2024-05-30 10:00:35,915 - INFO - [0/363571] Loss : 1.5131 +2024-05-30 10:00:41,695 - INFO - Test loss : 0.8280 +2024-05-30 10:00:41,695 - INFO - Saving output model to ../models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-30 10:00:41,855 - INFO - Saving output model to ../models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-30 10:00:42,542 - INFO - +No prior +2024-05-30 10:00:42,550 - INFO - (53041,) +2024-05-30 10:00:42,717 - INFO - Split ID: 0 +2024-05-30 10:00:42,717 - INFO - Top 1 acc (%): 69.83 +2024-05-30 10:00:42,717 - INFO - Top 3 acc (%): 84.61 +2024-05-30 10:00:42,717 - INFO - Top 5 acc (%): 89.23 +2024-05-30 10:00:42,717 - INFO - Top 10 acc (%): 94.29 +2024-05-30 10:01:06,594 - INFO - Split ID: 0 +2024-05-30 10:01:06,594 - INFO - Top 1 acc (%): 70.27 +2024-05-30 10:01:06,594 - INFO - Top 3 acc (%): 84.97 +2024-05-30 10:01:06,594 - INFO - Top 5 acc (%): 89.56 +2024-05-30 10:01:06,595 - INFO - Top 10 acc (%): 94.48 +2024-05-30 10:01:06,600 - INFO - +Sphere2Vec-dfs +2024-05-30 10:01:06,600 - INFO - Model : model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-30 10:01:10,553 - INFO - (53041,) +2024-05-30 10:01:10,696 - INFO - Split ID: 0 +2024-05-30 10:01:10,714 - INFO - Top 1 LocEnc acc (%): 9.46 +2024-05-30 10:01:10,716 - INFO - Top 3 LocEnc acc (%): 24.8 +2024-05-30 10:01:10,718 - INFO - Top 5 LocEnc acc (%): 31.75 +2024-05-30 10:01:10,719 - INFO - Top 10 LocEnc acc (%): 45.39 +2024-05-31 04:07:25,094 - INFO - +num_classes 62 +2024-05-31 04:07:25,094 - INFO - num train 363571 +2024-05-31 04:07:25,094 - INFO - num val 53041 +2024-05-31 04:07:25,094 - INFO - train loss full_loss +2024-05-31 04:07:25,095 - INFO - model name ../models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 04:07:25,095 - INFO - num users 1 +2024-05-31 04:07:26,085 - INFO - +Only Sphere2Vec-dfs +2024-05-31 04:07:26,086 - INFO - Model : model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 04:07:26,163 - INFO - +Epoch 121 +2024-05-31 04:07:27,951 - INFO - [0/363571] Loss : 1.5172 +2024-05-31 04:07:34,435 - INFO - Test loss : 0.8244 +2024-05-31 04:07:34,435 - INFO - +Epoch 122 +2024-05-31 04:07:35,633 - INFO - [0/363571] Loss : 1.5189 +2024-05-31 04:07:42,159 - INFO - Test loss : 0.8233 +2024-05-31 04:07:42,159 - INFO - +Epoch 123 +2024-05-31 04:07:43,373 - INFO - [0/363571] Loss : 1.5198 +2024-05-31 04:07:49,836 - INFO - Test loss : 0.8238 +2024-05-31 04:07:49,836 - INFO - +Epoch 124 +2024-05-31 04:07:51,065 - INFO - [0/363571] Loss : 1.5306 +2024-05-31 04:07:57,576 - INFO - Test loss : 0.8255 +2024-05-31 04:07:57,577 - INFO - +Epoch 125 +2024-05-31 04:07:58,909 - INFO - [0/363571] Loss : 1.5138 +2024-05-31 04:08:19,650 - INFO - +num_classes 62 +2024-05-31 04:08:19,650 - INFO - num train 363571 +2024-05-31 04:08:19,651 - INFO - num val 53041 +2024-05-31 04:08:19,651 - INFO - train loss full_loss +2024-05-31 04:08:19,651 - INFO - model name ../models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 04:08:19,651 - INFO - num users 1 +2024-05-31 04:08:20,636 - INFO - +Only Sphere2Vec-dfs +2024-05-31 04:08:20,636 - INFO - Model : 
model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 04:08:20,720 - INFO - Saving output model to ../models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 04:08:21,730 - INFO - Saving output model to ../models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 04:08:22,560 - INFO - +No prior +2024-05-31 04:08:22,579 - INFO - (53041,) +2024-05-31 04:08:23,539 - INFO - Save results to ../eval_results/eval_fmow__val_no_prior.csv +2024-05-31 04:08:23,540 - INFO - Split ID: 0 +2024-05-31 04:08:23,540 - INFO - Top 1 acc (%): 69.83 +2024-05-31 04:08:23,540 - INFO - Top 3 acc (%): 84.61 +2024-05-31 04:08:23,540 - INFO - Top 5 acc (%): 89.23 +2024-05-31 04:08:23,540 - INFO - Top 10 acc (%): 94.29 +2024-05-31 04:08:50,917 - INFO - Split ID: 0 +2024-05-31 04:08:50,917 - INFO - Top 1 hit (%): 70.27 +2024-05-31 04:08:50,917 - INFO - Top 3 hit (%): 84.97 +2024-05-31 04:08:50,917 - INFO - Top 5 hit (%): 89.56 +2024-05-31 04:08:50,918 - INFO - Top 10 hit (%): 94.48 +2024-05-31 04:08:50,932 - INFO - +Only Sphere2Vec-dfs +2024-05-31 04:08:50,932 - INFO - Model : model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar +2024-05-31 04:08:55,278 - INFO - (53041,) +2024-05-31 04:08:55,420 - INFO - Split ID: 0 +2024-05-31 04:08:55,438 - INFO - Top 1 LocEnc acc (%): 9.46 +2024-05-31 04:08:55,440 - INFO - Top 3 LocEnc acc (%): 24.8 +2024-05-31 04:08:55,441 - INFO - Top 5 LocEnc acc (%): 31.75 +2024-05-31 04:08:55,443 - INFO - Top 10 LocEnc acc (%): 45.39 diff --git a/pre_trained_models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar b/pre_trained_models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar new file mode 100755 index 00000000..1d972e24 Binary files /dev/null and b/pre_trained_models/sphere2vec_dfs/model_fmow_Sphere2Vec-dfs_inception_v3_0.0100_64_0.0100000_1.000_1_512_BATCH8192_leakyrelu.pth.tar differ diff --git a/pre_trained_models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.log b/pre_trained_models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..34b368ed --- /dev/null +++ b/pre_trained_models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.log @@ -0,0 +1,737 @@ +2024-05-30 03:04:07,203 - INFO - +num_classes 5089 +2024-05-30 03:04:07,204 - INFO - num train 569465 +2024-05-30 03:04:07,204 - INFO - num val 93622 +2024-05-30 03:04:07,204 - INFO - train loss full_loss +2024-05-30 03:04:07,204 - INFO - model name ../models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 03:04:07,204 - INFO - num users 17302 +2024-05-30 03:04:08,081 - INFO - +Epoch 0 +2024-05-30 03:04:20,524 - INFO - [204800/569465] Loss : 1.3484 +2024-05-30 03:04:23,921 - INFO - [266240/569465] Loss : 1.2762 +2024-05-30 03:04:28,554 - INFO - Test loss : 0.3794 +2024-05-30 03:04:28,554 - INFO - +Epoch 1 +2024-05-30 03:04:40,460 - INFO - [204800/569465] Loss : 0.9036 +2024-05-30 03:04:43,843 - INFO - [266240/569465] Loss : 0.8888 +2024-05-30 03:04:48,440 - 
INFO - Test loss : 0.3669 +2024-05-30 03:04:48,440 - INFO - +Epoch 2 +2024-05-30 03:05:00,335 - INFO - [204800/569465] Loss : 0.7958 +2024-05-30 03:05:03,706 - INFO - [266240/569465] Loss : 0.7905 +2024-05-30 03:05:08,303 - INFO - Test loss : 0.3406 +2024-05-30 03:05:08,303 - INFO - +Epoch 3 +2024-05-30 03:05:20,262 - INFO - [204800/569465] Loss : 0.7473 +2024-05-30 03:05:23,663 - INFO - [266240/569465] Loss : 0.7463 +2024-05-30 03:05:28,309 - INFO - Test loss : 0.3388 +2024-05-30 03:05:28,309 - INFO - +Epoch 4 +2024-05-30 03:05:40,274 - INFO - [204800/569465] Loss : 0.7227 +2024-05-30 03:05:43,683 - INFO - [266240/569465] Loss : 0.7213 +2024-05-30 03:05:48,302 - INFO - Test loss : 0.3771 +2024-05-30 03:05:48,302 - INFO - +Epoch 5 +2024-05-30 03:06:00,218 - INFO - [204800/569465] Loss : 0.7015 +2024-05-30 03:06:03,507 - INFO - [266240/569465] Loss : 0.7008 +2024-05-30 03:06:08,208 - INFO - Test loss : 0.3803 +2024-05-30 03:06:08,208 - INFO - +Epoch 6 +2024-05-30 03:06:20,117 - INFO - [204800/569465] Loss : 0.6873 +2024-05-30 03:06:23,514 - INFO - [266240/569465] Loss : 0.6875 +2024-05-30 03:06:28,191 - INFO - Test loss : 0.3760 +2024-05-30 03:06:28,191 - INFO - +Epoch 7 +2024-05-30 03:06:40,172 - INFO - [204800/569465] Loss : 0.6751 +2024-05-30 03:06:43,564 - INFO - [266240/569465] Loss : 0.6752 +2024-05-30 03:06:48,181 - INFO - Test loss : 0.3822 +2024-05-30 03:06:48,181 - INFO - +Epoch 8 +2024-05-30 03:07:00,098 - INFO - [204800/569465] Loss : 0.6658 +2024-05-30 03:07:03,472 - INFO - [266240/569465] Loss : 0.6662 +2024-05-30 03:07:08,081 - INFO - Test loss : 0.4039 +2024-05-30 03:07:08,081 - INFO - +Epoch 9 +2024-05-30 03:07:19,972 - INFO - [204800/569465] Loss : 0.6588 +2024-05-30 03:07:23,369 - INFO - [266240/569465] Loss : 0.6583 +2024-05-30 03:07:27,981 - INFO - Test loss : 0.4049 +2024-05-30 03:07:27,981 - INFO - +Epoch 10 +2024-05-30 03:07:39,913 - INFO - [204800/569465] Loss : 0.6514 +2024-05-30 03:07:43,218 - INFO - [266240/569465] Loss : 0.6515 +2024-05-30 03:07:47,915 - INFO - Test loss : 0.4205 +2024-05-30 03:07:49,453 - INFO - (93622,) +2024-05-30 03:09:21,213 - INFO - Split ID: 0 +2024-05-30 03:09:21,251 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 0.54 +2024-05-30 03:09:21,255 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 1.46 +2024-05-30 03:09:21,258 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 2.28 +2024-05-30 03:09:21,261 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 4.39 +2024-05-30 03:09:21,513 - INFO - +No prior +2024-05-30 03:09:22,072 - INFO - (95986,) +2024-05-30 03:09:45,957 - INFO - Split ID: 0 +2024-05-30 03:09:45,957 - INFO - Top 1 (Epoch 10)acc (%): 63.27 +2024-05-30 03:09:45,957 - INFO - Top 3 (Epoch 10)acc (%): 79.82 +2024-05-30 03:09:45,958 - INFO - Top 5 (Epoch 10)acc (%): 84.51 +2024-05-30 03:09:45,958 - INFO - Top 10 (Epoch 10)acc (%): 88.99 +2024-05-30 03:10:32,952 - INFO - Split ID: 0 +2024-05-30 03:10:32,953 - INFO - Top 1 (Epoch 10)acc (%): 69.1 +2024-05-30 03:10:32,953 - INFO - Top 3 (Epoch 10)acc (%): 84.35 +2024-05-30 03:10:32,953 - INFO - Top 5 (Epoch 10)acc (%): 88.17 +2024-05-30 03:10:32,953 - INFO - Top 10 (Epoch 10)acc (%): 91.77 +2024-05-30 03:10:32,955 - INFO - +Epoch 11 +2024-05-30 03:10:44,882 - INFO - [204800/569465] Loss : 0.6431 +2024-05-30 03:10:48,159 - INFO - [266240/569465] Loss : 0.6445 +2024-05-30 03:10:52,843 - INFO - Test loss : 0.4118 +2024-05-30 03:10:52,843 - INFO - +Epoch 12 +2024-05-30 03:11:04,696 - INFO - [204800/569465] Loss : 0.6399 +2024-05-30 03:11:07,982 - INFO - [266240/569465] Loss : 0.6406 +2024-05-30 03:11:12,666 - INFO - Test 
loss : 0.4749 +2024-05-30 03:11:12,666 - INFO - +Epoch 13 +2024-05-30 03:11:24,539 - INFO - [204800/569465] Loss : 0.6356 +2024-05-30 03:11:27,827 - INFO - [266240/569465] Loss : 0.6364 +2024-05-30 03:11:32,535 - INFO - Test loss : 0.4554 +2024-05-30 03:11:32,535 - INFO - +Epoch 14 +2024-05-30 03:11:44,436 - INFO - [204800/569465] Loss : 0.6291 +2024-05-30 03:11:47,726 - INFO - [266240/569465] Loss : 0.6299 +2024-05-30 03:11:52,333 - INFO - Test loss : 0.4356 +2024-05-30 03:11:52,334 - INFO - +Epoch 15 +2024-05-30 03:12:04,244 - INFO - [204800/569465] Loss : 0.6257 +2024-05-30 03:12:07,630 - INFO - [266240/569465] Loss : 0.6271 +2024-05-30 03:12:12,221 - INFO - Test loss : 0.4574 +2024-05-30 03:12:12,221 - INFO - +Epoch 16 +2024-05-30 03:12:24,118 - INFO - [204800/569465] Loss : 0.6234 +2024-05-30 03:12:27,519 - INFO - [266240/569465] Loss : 0.6245 +2024-05-30 03:12:32,122 - INFO - Test loss : 0.4399 +2024-05-30 03:12:32,123 - INFO - +Epoch 17 +2024-05-30 03:12:44,016 - INFO - [204800/569465] Loss : 0.6189 +2024-05-30 03:12:47,388 - INFO - [266240/569465] Loss : 0.6200 +2024-05-30 03:12:51,969 - INFO - Test loss : 0.4524 +2024-05-30 03:12:51,970 - INFO - +Epoch 18 +2024-05-30 03:13:03,810 - INFO - [204800/569465] Loss : 0.6164 +2024-05-30 03:13:07,163 - INFO - [266240/569465] Loss : 0.6171 +2024-05-30 03:13:11,755 - INFO - Test loss : 0.4841 +2024-05-30 03:13:11,755 - INFO - +Epoch 19 +2024-05-30 03:13:23,648 - INFO - [204800/569465] Loss : 0.6124 +2024-05-30 03:13:27,028 - INFO - [266240/569465] Loss : 0.6136 +2024-05-30 03:13:31,684 - INFO - Test loss : 0.4904 +2024-05-30 03:13:31,684 - INFO - +Epoch 20 +2024-05-30 03:13:43,626 - INFO - [204800/569465] Loss : 0.6108 +2024-05-30 03:13:47,052 - INFO - [266240/569465] Loss : 0.6113 +2024-05-30 03:13:51,665 - INFO - Test loss : 0.4753 +2024-05-30 03:13:53,113 - INFO - (93622,) +2024-05-30 03:14:39,596 - INFO - Split ID: 0 +2024-05-30 03:14:39,634 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 0.6 +2024-05-30 03:14:39,637 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 1.65 +2024-05-30 03:14:39,640 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 2.72 +2024-05-30 03:14:39,643 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 5.01 +2024-05-30 03:14:39,954 - INFO - +No prior +2024-05-30 03:14:40,565 - INFO - (95986,) +2024-05-30 03:14:58,847 - INFO - Split ID: 0 +2024-05-30 03:14:58,848 - INFO - Top 1 (Epoch 20)acc (%): 63.27 +2024-05-30 03:14:58,848 - INFO - Top 3 (Epoch 20)acc (%): 79.82 +2024-05-30 03:14:58,848 - INFO - Top 5 (Epoch 20)acc (%): 84.51 +2024-05-30 03:14:58,848 - INFO - Top 10 (Epoch 20)acc (%): 88.99 +2024-05-30 03:15:46,025 - INFO - Split ID: 0 +2024-05-30 03:15:46,025 - INFO - Top 1 (Epoch 20)acc (%): 69.28 +2024-05-30 03:15:46,025 - INFO - Top 3 (Epoch 20)acc (%): 84.41 +2024-05-30 03:15:46,025 - INFO - Top 5 (Epoch 20)acc (%): 88.14 +2024-05-30 03:15:46,026 - INFO - Top 10 (Epoch 20)acc (%): 91.74 +2024-05-30 03:15:46,055 - INFO - +Epoch 21 +2024-05-30 03:15:58,129 - INFO - [204800/569465] Loss : 0.6089 +2024-05-30 03:16:01,567 - INFO - [266240/569465] Loss : 0.6096 +2024-05-30 03:16:06,226 - INFO - Test loss : 0.4831 +2024-05-30 03:16:06,226 - INFO - +Epoch 22 +2024-05-30 03:16:18,261 - INFO - [204800/569465] Loss : 0.6050 +2024-05-30 03:16:21,604 - INFO - [266240/569465] Loss : 0.6059 +2024-05-30 03:16:26,337 - INFO - Test loss : 0.4797 +2024-05-30 03:16:26,337 - INFO - +Epoch 23 +2024-05-30 03:16:38,375 - INFO - [204800/569465] Loss : 0.6025 +2024-05-30 03:16:41,814 - INFO - [266240/569465] Loss : 0.6035 +2024-05-30 03:16:46,461 - INFO - Test loss 
: 0.4878 +2024-05-30 03:16:46,461 - INFO - +Epoch 24 +2024-05-30 03:16:58,471 - INFO - [204800/569465] Loss : 0.6017 +2024-05-30 03:17:01,892 - INFO - [266240/569465] Loss : 0.6029 +2024-05-30 03:17:06,536 - INFO - Test loss : 0.4900 +2024-05-30 03:17:06,536 - INFO - +Epoch 25 +2024-05-30 03:17:18,591 - INFO - [204800/569465] Loss : 0.5995 +2024-05-30 03:17:22,031 - INFO - [266240/569465] Loss : 0.6003 +2024-05-30 03:17:26,717 - INFO - Test loss : 0.5103 +2024-05-30 03:17:26,717 - INFO - +Epoch 26 +2024-05-30 03:17:38,781 - INFO - [204800/569465] Loss : 0.5964 +2024-05-30 03:17:42,195 - INFO - [266240/569465] Loss : 0.5976 +2024-05-30 03:17:46,871 - INFO - Test loss : 0.4852 +2024-05-30 03:17:46,872 - INFO - +Epoch 27 +2024-05-30 03:17:58,921 - INFO - [204800/569465] Loss : 0.5955 +2024-05-30 03:18:02,249 - INFO - [266240/569465] Loss : 0.5962 +2024-05-30 03:18:07,020 - INFO - Test loss : 0.5131 +2024-05-30 03:18:07,020 - INFO - +Epoch 28 +2024-05-30 03:18:19,035 - INFO - [204800/569465] Loss : 0.5930 +2024-05-30 03:18:22,375 - INFO - [266240/569465] Loss : 0.5940 +2024-05-30 03:18:27,133 - INFO - Test loss : 0.5258 +2024-05-30 03:18:27,133 - INFO - +Epoch 29 +2024-05-30 03:18:39,109 - INFO - [204800/569465] Loss : 0.5919 +2024-05-30 03:18:42,440 - INFO - [266240/569465] Loss : 0.5931 +2024-05-30 03:18:47,203 - INFO - Test loss : 0.5220 +2024-05-30 03:18:47,203 - INFO - +Epoch 30 +2024-05-30 03:18:59,214 - INFO - [204800/569465] Loss : 0.5917 +2024-05-30 03:19:02,548 - INFO - [266240/569465] Loss : 0.5925 +2024-05-30 03:19:07,326 - INFO - Test loss : 0.5315 +2024-05-30 03:19:08,861 - INFO - (93622,) +2024-05-30 03:19:55,569 - INFO - Split ID: 0 +2024-05-30 03:19:55,608 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 0.69 +2024-05-30 03:19:55,611 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 1.88 +2024-05-30 03:19:55,614 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 2.95 +2024-05-30 03:19:55,617 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 5.17 +2024-05-30 03:19:55,893 - INFO - +No prior +2024-05-30 03:19:56,509 - INFO - (95986,) +2024-05-30 03:20:14,784 - INFO - Split ID: 0 +2024-05-30 03:20:14,784 - INFO - Top 1 (Epoch 30)acc (%): 63.27 +2024-05-30 03:20:14,784 - INFO - Top 3 (Epoch 30)acc (%): 79.82 +2024-05-30 03:20:14,785 - INFO - Top 5 (Epoch 30)acc (%): 84.51 +2024-05-30 03:20:14,785 - INFO - Top 10 (Epoch 30)acc (%): 88.99 +2024-05-30 03:21:02,054 - INFO - Split ID: 0 +2024-05-30 03:21:02,054 - INFO - Top 1 (Epoch 30)acc (%): 69.42 +2024-05-30 03:21:02,054 - INFO - Top 3 (Epoch 30)acc (%): 84.44 +2024-05-30 03:21:02,054 - INFO - Top 5 (Epoch 30)acc (%): 88.14 +2024-05-30 03:21:02,054 - INFO - Top 10 (Epoch 30)acc (%): 91.72 +2024-05-30 03:21:02,056 - INFO - +Epoch 31 +2024-05-30 03:21:14,141 - INFO - [204800/569465] Loss : 0.5892 +2024-05-30 03:21:17,460 - INFO - [266240/569465] Loss : 0.5901 +2024-05-30 03:21:22,123 - INFO - Test loss : 0.5264 +2024-05-30 03:21:22,123 - INFO - +Epoch 32 +2024-05-30 03:21:34,137 - INFO - [204800/569465] Loss : 0.5884 +2024-05-30 03:21:37,539 - INFO - [266240/569465] Loss : 0.5892 +2024-05-30 03:21:42,257 - INFO - Test loss : 0.5232 +2024-05-30 03:21:42,257 - INFO - +Epoch 33 +2024-05-30 03:21:54,344 - INFO - [204800/569465] Loss : 0.5865 +2024-05-30 03:21:57,759 - INFO - [266240/569465] Loss : 0.5876 +2024-05-30 03:22:02,402 - INFO - Test loss : 0.5449 +2024-05-30 03:22:02,402 - INFO - +Epoch 34 +2024-05-30 03:22:14,451 - INFO - [204800/569465] Loss : 0.5844 +2024-05-30 03:22:17,869 - INFO - [266240/569465] Loss : 0.5854 +2024-05-30 03:22:22,519 - INFO - Test loss : 
0.5275 +2024-05-30 03:22:22,519 - INFO - +Epoch 35 +2024-05-30 03:22:34,566 - INFO - [204800/569465] Loss : 0.5844 +2024-05-30 03:22:38,008 - INFO - [266240/569465] Loss : 0.5853 +2024-05-30 03:22:42,709 - INFO - Test loss : 0.5274 +2024-05-30 03:22:42,709 - INFO - +Epoch 36 +2024-05-30 03:22:54,635 - INFO - [204800/569465] Loss : 0.5818 +2024-05-30 03:22:57,995 - INFO - [266240/569465] Loss : 0.5832 +2024-05-30 03:23:02,577 - INFO - Test loss : 0.5451 +2024-05-30 03:23:02,577 - INFO - +Epoch 37 +2024-05-30 03:23:14,430 - INFO - [204800/569465] Loss : 0.5813 +2024-05-30 03:23:17,805 - INFO - [266240/569465] Loss : 0.5824 +2024-05-30 03:23:22,396 - INFO - Test loss : 0.5331 +2024-05-30 03:23:22,396 - INFO - +Epoch 38 +2024-05-30 03:23:34,251 - INFO - [204800/569465] Loss : 0.5801 +2024-05-30 03:23:37,634 - INFO - [266240/569465] Loss : 0.5814 +2024-05-30 03:23:42,275 - INFO - Test loss : 0.5505 +2024-05-30 03:23:42,275 - INFO - +Epoch 39 +2024-05-30 03:23:54,267 - INFO - [204800/569465] Loss : 0.5804 +2024-05-30 03:23:57,627 - INFO - [266240/569465] Loss : 0.5812 +2024-05-30 03:24:02,371 - INFO - Test loss : 0.5384 +2024-05-30 03:24:02,371 - INFO - +Epoch 40 +2024-05-30 03:24:14,440 - INFO - [204800/569465] Loss : 0.5769 +2024-05-30 03:24:17,886 - INFO - [266240/569465] Loss : 0.5780 +2024-05-30 03:24:22,542 - INFO - Test loss : 0.5419 +2024-05-30 03:24:24,091 - INFO - (93622,) +2024-05-30 03:25:10,910 - INFO - Split ID: 0 +2024-05-30 03:25:10,947 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 0.68 +2024-05-30 03:25:10,950 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 2.02 +2024-05-30 03:25:10,954 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 3.1 +2024-05-30 03:25:10,957 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 5.74 +2024-05-30 03:25:11,257 - INFO - +No prior +2024-05-30 03:25:11,908 - INFO - (95986,) +2024-05-30 03:25:30,288 - INFO - Split ID: 0 +2024-05-30 03:25:30,288 - INFO - Top 1 (Epoch 40)acc (%): 63.27 +2024-05-30 03:25:30,288 - INFO - Top 3 (Epoch 40)acc (%): 79.82 +2024-05-30 03:25:30,288 - INFO - Top 5 (Epoch 40)acc (%): 84.51 +2024-05-30 03:25:30,289 - INFO - Top 10 (Epoch 40)acc (%): 88.99 +2024-05-30 03:26:17,622 - INFO - Split ID: 0 +2024-05-30 03:26:17,622 - INFO - Top 1 (Epoch 40)acc (%): 69.39 +2024-05-30 03:26:17,623 - INFO - Top 3 (Epoch 40)acc (%): 84.39 +2024-05-30 03:26:17,623 - INFO - Top 5 (Epoch 40)acc (%): 88.1 +2024-05-30 03:26:17,623 - INFO - Top 10 (Epoch 40)acc (%): 91.71 +2024-05-30 03:26:17,625 - INFO - +Epoch 41 +2024-05-30 03:26:29,654 - INFO - [204800/569465] Loss : 0.5770 +2024-05-30 03:26:33,061 - INFO - [266240/569465] Loss : 0.5778 +2024-05-30 03:26:37,713 - INFO - Test loss : 0.5384 +2024-05-30 03:26:37,713 - INFO - +Epoch 42 +2024-05-30 03:26:49,744 - INFO - [204800/569465] Loss : 0.5761 +2024-05-30 03:26:53,141 - INFO - [266240/569465] Loss : 0.5772 +2024-05-30 03:26:57,789 - INFO - Test loss : 0.5576 +2024-05-30 03:26:57,789 - INFO - +Epoch 43 +2024-05-30 03:27:09,856 - INFO - [204800/569465] Loss : 0.5747 +2024-05-30 03:27:13,310 - INFO - [266240/569465] Loss : 0.5756 +2024-05-30 03:27:18,017 - INFO - Test loss : 0.5578 +2024-05-30 03:27:18,017 - INFO - +Epoch 44 +2024-05-30 03:27:30,046 - INFO - [204800/569465] Loss : 0.5744 +2024-05-30 03:27:33,377 - INFO - [266240/569465] Loss : 0.5753 +2024-05-30 03:27:38,125 - INFO - Test loss : 0.5577 +2024-05-30 03:27:38,125 - INFO - +Epoch 45 +2024-05-30 03:27:50,164 - INFO - [204800/569465] Loss : 0.5725 +2024-05-30 03:27:53,524 - INFO - [266240/569465] Loss : 0.5732 +2024-05-30 03:27:58,289 - INFO - Test loss : 0.5657 
+2024-05-30 03:27:58,289 - INFO - +Epoch 46 +2024-05-30 03:28:10,417 - INFO - [204800/569465] Loss : 0.5715 +2024-05-30 03:28:13,733 - INFO - [266240/569465] Loss : 0.5731 +2024-05-30 03:28:18,454 - INFO - Test loss : 0.5604 +2024-05-30 03:28:18,454 - INFO - +Epoch 47 +2024-05-30 03:28:30,528 - INFO - [204800/569465] Loss : 0.5723 +2024-05-30 03:28:33,863 - INFO - [266240/569465] Loss : 0.5734 +2024-05-30 03:28:38,615 - INFO - Test loss : 0.5630 +2024-05-30 03:28:38,616 - INFO - +Epoch 48 +2024-05-30 03:28:50,704 - INFO - [204800/569465] Loss : 0.5717 +2024-05-30 03:28:54,071 - INFO - [266240/569465] Loss : 0.5726 +2024-05-30 03:28:58,791 - INFO - Test loss : 0.5578 +2024-05-30 03:28:58,791 - INFO - +Epoch 49 +2024-05-30 03:29:10,866 - INFO - [204800/569465] Loss : 0.5687 +2024-05-30 03:29:14,279 - INFO - [266240/569465] Loss : 0.5700 +2024-05-30 03:29:18,948 - INFO - Test loss : 0.5684 +2024-05-30 03:29:18,948 - INFO - +Epoch 50 +2024-05-30 03:29:31,071 - INFO - [204800/569465] Loss : 0.5694 +2024-05-30 03:29:34,504 - INFO - [266240/569465] Loss : 0.5701 +2024-05-30 03:29:39,175 - INFO - Test loss : 0.5867 +2024-05-30 03:29:40,685 - INFO - (93622,) +2024-05-30 03:30:27,652 - INFO - Split ID: 0 +2024-05-30 03:30:27,691 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 0.84 +2024-05-30 03:30:27,694 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 2.04 +2024-05-30 03:30:27,697 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 3.1 +2024-05-30 03:30:27,701 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 5.43 +2024-05-30 03:30:28,031 - INFO - +No prior +2024-05-30 03:30:28,680 - INFO - (95986,) +2024-05-30 03:30:47,145 - INFO - Split ID: 0 +2024-05-30 03:30:47,145 - INFO - Top 1 (Epoch 50)acc (%): 63.27 +2024-05-30 03:30:47,145 - INFO - Top 3 (Epoch 50)acc (%): 79.82 +2024-05-30 03:30:47,145 - INFO - Top 5 (Epoch 50)acc (%): 84.51 +2024-05-30 03:30:47,146 - INFO - Top 10 (Epoch 50)acc (%): 88.99 +2024-05-30 03:31:34,827 - INFO - Split ID: 0 +2024-05-30 03:31:34,827 - INFO - Top 1 (Epoch 50)acc (%): 69.39 +2024-05-30 03:31:34,828 - INFO - Top 3 (Epoch 50)acc (%): 84.32 +2024-05-30 03:31:34,828 - INFO - Top 5 (Epoch 50)acc (%): 88.03 +2024-05-30 03:31:34,828 - INFO - Top 10 (Epoch 50)acc (%): 91.64 +2024-05-30 03:31:34,832 - INFO - +Epoch 51 +2024-05-30 03:31:46,890 - INFO - [204800/569465] Loss : 0.5675 +2024-05-30 03:31:50,332 - INFO - [266240/569465] Loss : 0.5687 +2024-05-30 03:31:54,998 - INFO - Test loss : 0.5831 +2024-05-30 03:31:54,998 - INFO - +Epoch 52 +2024-05-30 03:32:06,841 - INFO - [204800/569465] Loss : 0.5674 +2024-05-30 03:32:10,208 - INFO - [266240/569465] Loss : 0.5683 +2024-05-30 03:32:14,813 - INFO - Test loss : 0.5843 +2024-05-30 03:32:14,813 - INFO - +Epoch 53 +2024-05-30 03:32:26,666 - INFO - [204800/569465] Loss : 0.5659 +2024-05-30 03:32:30,040 - INFO - [266240/569465] Loss : 0.5671 +2024-05-30 03:32:34,646 - INFO - Test loss : 0.5723 +2024-05-30 03:32:34,646 - INFO - +Epoch 54 +2024-05-30 03:32:46,583 - INFO - [204800/569465] Loss : 0.5654 +2024-05-30 03:32:50,052 - INFO - [266240/569465] Loss : 0.5659 +2024-05-30 03:32:54,707 - INFO - Test loss : 0.6118 +2024-05-30 03:32:54,707 - INFO - +Epoch 55 +2024-05-30 03:33:06,747 - INFO - [204800/569465] Loss : 0.5652 +2024-05-30 03:33:10,159 - INFO - [266240/569465] Loss : 0.5665 +2024-05-30 03:33:14,787 - INFO - Test loss : 0.5918 +2024-05-30 03:33:14,787 - INFO - +Epoch 56 +2024-05-30 03:33:26,842 - INFO - [204800/569465] Loss : 0.5639 +2024-05-30 03:33:30,140 - INFO - [266240/569465] Loss : 0.5648 +2024-05-30 03:33:34,860 - INFO - Test loss : 0.5806 
+2024-05-30 03:33:34,860 - INFO - +Epoch 57 +2024-05-30 03:33:46,879 - INFO - [204800/569465] Loss : 0.5636 +2024-05-30 03:33:50,304 - INFO - [266240/569465] Loss : 0.5642 +2024-05-30 03:33:54,997 - INFO - Test loss : 0.6256 +2024-05-30 03:33:54,997 - INFO - +Epoch 58 +2024-05-30 03:34:07,024 - INFO - [204800/569465] Loss : 0.5639 +2024-05-30 03:34:10,453 - INFO - [266240/569465] Loss : 0.5647 +2024-05-30 03:34:15,119 - INFO - Test loss : 0.5970 +2024-05-30 03:34:15,119 - INFO - +Epoch 59 +2024-05-30 03:34:27,115 - INFO - [204800/569465] Loss : 0.5625 +2024-05-30 03:34:30,523 - INFO - [266240/569465] Loss : 0.5631 +2024-05-30 03:34:35,165 - INFO - Test loss : 0.5827 +2024-05-30 03:34:35,165 - INFO - +Epoch 60 +2024-05-30 03:34:47,146 - INFO - [204800/569465] Loss : 0.5616 +2024-05-30 03:34:50,586 - INFO - [266240/569465] Loss : 0.5632 +2024-05-30 03:34:55,242 - INFO - Test loss : 0.6067 +2024-05-30 03:34:56,713 - INFO - (93622,) +2024-05-30 03:35:43,272 - INFO - Split ID: 0 +2024-05-30 03:35:43,310 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 0.81 +2024-05-30 03:35:43,313 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 2.12 +2024-05-30 03:35:43,316 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 3.23 +2024-05-30 03:35:43,319 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 5.75 +2024-05-30 03:35:43,606 - INFO - +No prior +2024-05-30 03:35:44,230 - INFO - (95986,) +2024-05-30 03:36:02,671 - INFO - Split ID: 0 +2024-05-30 03:36:02,672 - INFO - Top 1 (Epoch 60)acc (%): 63.27 +2024-05-30 03:36:02,672 - INFO - Top 3 (Epoch 60)acc (%): 79.82 +2024-05-30 03:36:02,672 - INFO - Top 5 (Epoch 60)acc (%): 84.51 +2024-05-30 03:36:02,672 - INFO - Top 10 (Epoch 60)acc (%): 88.99 +2024-05-30 03:36:50,429 - INFO - Split ID: 0 +2024-05-30 03:36:50,430 - INFO - Top 1 (Epoch 60)acc (%): 69.36 +2024-05-30 03:36:50,430 - INFO - Top 3 (Epoch 60)acc (%): 84.26 +2024-05-30 03:36:50,430 - INFO - Top 5 (Epoch 60)acc (%): 88.01 +2024-05-30 03:36:50,430 - INFO - Top 10 (Epoch 60)acc (%): 91.58 +2024-05-30 03:36:50,432 - INFO - +Epoch 61 +2024-05-30 03:37:02,504 - INFO - [204800/569465] Loss : 0.5614 +2024-05-30 03:37:05,853 - INFO - [266240/569465] Loss : 0.5621 +2024-05-30 03:37:10,619 - INFO - Test loss : 0.5864 +2024-05-30 03:37:10,619 - INFO - +Epoch 62 +2024-05-30 03:37:22,720 - INFO - [204800/569465] Loss : 0.5615 +2024-05-30 03:37:26,058 - INFO - [266240/569465] Loss : 0.5619 +2024-05-30 03:37:30,809 - INFO - Test loss : 0.5677 +2024-05-30 03:37:30,809 - INFO - +Epoch 63 +2024-05-30 03:37:42,810 - INFO - [204800/569465] Loss : 0.5602 +2024-05-30 03:37:46,149 - INFO - [266240/569465] Loss : 0.5610 +2024-05-30 03:37:50,878 - INFO - Test loss : 0.5999 +2024-05-30 03:37:50,878 - INFO - +Epoch 64 +2024-05-30 03:38:02,917 - INFO - [204800/569465] Loss : 0.5601 +2024-05-30 03:38:06,252 - INFO - [266240/569465] Loss : 0.5605 +2024-05-30 03:38:11,008 - INFO - Test loss : 0.5882 +2024-05-30 03:38:11,009 - INFO - +Epoch 65 +2024-05-30 03:38:23,068 - INFO - [204800/569465] Loss : 0.5585 +2024-05-30 03:38:26,388 - INFO - [266240/569465] Loss : 0.5597 +2024-05-30 03:38:31,055 - INFO - Test loss : 0.5870 +2024-05-30 03:38:31,055 - INFO - +Epoch 66 +2024-05-30 03:38:43,105 - INFO - [204800/569465] Loss : 0.5585 +2024-05-30 03:38:46,510 - INFO - [266240/569465] Loss : 0.5595 +2024-05-30 03:38:51,177 - INFO - Test loss : 0.6165 +2024-05-30 03:38:51,178 - INFO - +Epoch 67 +2024-05-30 03:39:03,184 - INFO - [204800/569465] Loss : 0.5579 +2024-05-30 03:39:06,614 - INFO - [266240/569465] Loss : 0.5584 +2024-05-30 03:39:11,282 - INFO - Test loss : 0.5984 
+2024-05-30 03:39:11,282 - INFO - +Epoch 68 +2024-05-30 03:39:23,326 - INFO - [204800/569465] Loss : 0.5567 +2024-05-30 03:39:26,776 - INFO - [266240/569465] Loss : 0.5577 +2024-05-30 03:39:31,448 - INFO - Test loss : 0.6003 +2024-05-30 03:39:31,449 - INFO - +Epoch 69 +2024-05-30 03:39:43,535 - INFO - [204800/569465] Loss : 0.5562 +2024-05-30 03:39:46,947 - INFO - [266240/569465] Loss : 0.5573 +2024-05-30 03:39:51,572 - INFO - Test loss : 0.5928 +2024-05-30 03:39:51,572 - INFO - +Epoch 70 +2024-05-30 03:40:03,573 - INFO - [204800/569465] Loss : 0.5561 +2024-05-30 03:40:06,990 - INFO - [266240/569465] Loss : 0.5571 +2024-05-30 03:40:11,667 - INFO - Test loss : 0.6037 +2024-05-30 03:40:13,160 - INFO - (93622,) +2024-05-30 03:40:59,820 - INFO - Split ID: 0 +2024-05-30 03:40:59,857 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 0.81 +2024-05-30 03:40:59,860 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 2.1 +2024-05-30 03:40:59,864 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 3.24 +2024-05-30 03:40:59,867 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 5.91 +2024-05-30 03:41:00,119 - INFO - +No prior +2024-05-30 03:41:00,679 - INFO - (95986,) +2024-05-30 03:41:18,887 - INFO - Split ID: 0 +2024-05-30 03:41:18,887 - INFO - Top 1 (Epoch 70)acc (%): 63.27 +2024-05-30 03:41:18,887 - INFO - Top 3 (Epoch 70)acc (%): 79.82 +2024-05-30 03:41:18,887 - INFO - Top 5 (Epoch 70)acc (%): 84.51 +2024-05-30 03:41:18,888 - INFO - Top 10 (Epoch 70)acc (%): 88.99 +2024-05-30 03:42:06,068 - INFO - Split ID: 0 +2024-05-30 03:42:06,068 - INFO - Top 1 (Epoch 70)acc (%): 69.44 +2024-05-30 03:42:06,070 - INFO - Top 3 (Epoch 70)acc (%): 84.26 +2024-05-30 03:42:06,070 - INFO - Top 5 (Epoch 70)acc (%): 88.01 +2024-05-30 03:42:06,070 - INFO - Top 10 (Epoch 70)acc (%): 91.57 +2024-05-30 03:42:06,072 - INFO - +Epoch 71 +2024-05-30 03:42:18,131 - INFO - [204800/569465] Loss : 0.5552 +2024-05-30 03:42:21,538 - INFO - [266240/569465] Loss : 0.5556 +2024-05-30 03:42:26,179 - INFO - Test loss : 0.6427 +2024-05-30 03:42:26,179 - INFO - +Epoch 72 +2024-05-30 03:42:38,249 - INFO - [204800/569465] Loss : 0.5548 +2024-05-30 03:42:41,662 - INFO - [266240/569465] Loss : 0.5559 +2024-05-30 03:42:46,333 - INFO - Test loss : 0.6191 +2024-05-30 03:42:46,333 - INFO - +Epoch 73 +2024-05-30 03:42:58,382 - INFO - [204800/569465] Loss : 0.5555 +2024-05-30 03:43:01,695 - INFO - [266240/569465] Loss : 0.5557 +2024-05-30 03:43:06,421 - INFO - Test loss : 0.6216 +2024-05-30 03:43:06,421 - INFO - +Epoch 74 +2024-05-30 03:43:18,448 - INFO - [204800/569465] Loss : 0.5543 +2024-05-30 03:43:21,891 - INFO - [266240/569465] Loss : 0.5550 +2024-05-30 03:43:26,549 - INFO - Test loss : 0.6086 +2024-05-30 03:43:26,549 - INFO - +Epoch 75 +2024-05-30 03:43:38,582 - INFO - [204800/569465] Loss : 0.5539 +2024-05-30 03:43:41,989 - INFO - [266240/569465] Loss : 0.5544 +2024-05-30 03:43:46,675 - INFO - Test loss : 0.6310 +2024-05-30 03:43:46,676 - INFO - +Epoch 76 +2024-05-30 03:43:58,758 - INFO - [204800/569465] Loss : 0.5526 +2024-05-30 03:44:02,228 - INFO - [266240/569465] Loss : 0.5535 +2024-05-30 03:44:06,909 - INFO - Test loss : 0.6213 +2024-05-30 03:44:06,910 - INFO - +Epoch 77 +2024-05-30 03:44:18,965 - INFO - [204800/569465] Loss : 0.5523 +2024-05-30 03:44:22,397 - INFO - [266240/569465] Loss : 0.5529 +2024-05-30 03:44:27,064 - INFO - Test loss : 0.6223 +2024-05-30 03:44:27,064 - INFO - +Epoch 78 +2024-05-30 03:44:39,169 - INFO - [204800/569465] Loss : 0.5523 +2024-05-30 03:44:42,507 - INFO - [266240/569465] Loss : 0.5533 +2024-05-30 03:44:47,275 - INFO - Test loss : 0.6292 
+2024-05-30 03:44:47,275 - INFO - +Epoch 79 +2024-05-30 03:44:59,373 - INFO - [204800/569465] Loss : 0.5508 +2024-05-30 03:45:02,709 - INFO - [266240/569465] Loss : 0.5515 +2024-05-30 03:45:07,487 - INFO - Test loss : 0.6273 +2024-05-30 03:45:07,488 - INFO - +Epoch 80 +2024-05-30 03:45:19,525 - INFO - [204800/569465] Loss : 0.5504 +2024-05-30 03:45:22,906 - INFO - [266240/569465] Loss : 0.5514 +2024-05-30 03:45:27,655 - INFO - Test loss : 0.6076 +2024-05-30 03:45:29,187 - INFO - (93622,) +2024-05-30 03:46:16,019 - INFO - Split ID: 0 +2024-05-30 03:46:16,057 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 0.76 +2024-05-30 03:46:16,060 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 2.15 +2024-05-30 03:46:16,063 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 3.33 +2024-05-30 03:46:16,066 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 5.96 +2024-05-30 03:46:16,332 - INFO - +No prior +2024-05-30 03:46:16,986 - INFO - (95986,) +2024-05-30 03:46:35,407 - INFO - Split ID: 0 +2024-05-30 03:46:35,408 - INFO - Top 1 (Epoch 80)acc (%): 63.27 +2024-05-30 03:46:35,408 - INFO - Top 3 (Epoch 80)acc (%): 79.82 +2024-05-30 03:46:35,408 - INFO - Top 5 (Epoch 80)acc (%): 84.51 +2024-05-30 03:46:35,408 - INFO - Top 10 (Epoch 80)acc (%): 88.99 +2024-05-30 03:47:22,736 - INFO - Split ID: 0 +2024-05-30 03:47:22,737 - INFO - Top 1 (Epoch 80)acc (%): 69.43 +2024-05-30 03:47:22,738 - INFO - Top 3 (Epoch 80)acc (%): 84.24 +2024-05-30 03:47:22,738 - INFO - Top 5 (Epoch 80)acc (%): 87.97 +2024-05-30 03:47:22,739 - INFO - Top 10 (Epoch 80)acc (%): 91.54 +2024-05-30 03:47:22,740 - INFO - +Epoch 81 +2024-05-30 03:47:34,796 - INFO - [204800/569465] Loss : 0.5516 +2024-05-30 03:47:38,140 - INFO - [266240/569465] Loss : 0.5518 +2024-05-30 03:47:42,891 - INFO - Test loss : 0.6140 +2024-05-30 03:47:42,891 - INFO - +Epoch 82 +2024-05-30 03:47:54,960 - INFO - [204800/569465] Loss : 0.5507 +2024-05-30 03:47:58,323 - INFO - [266240/569465] Loss : 0.5517 +2024-05-30 03:48:03,003 - INFO - Test loss : 0.6237 +2024-05-30 03:48:03,003 - INFO - +Epoch 83 +2024-05-30 03:48:15,089 - INFO - [204800/569465] Loss : 0.5495 +2024-05-30 03:48:18,521 - INFO - [266240/569465] Loss : 0.5505 +2024-05-30 03:48:23,176 - INFO - Test loss : 0.6413 +2024-05-30 03:48:23,176 - INFO - +Epoch 84 +2024-05-30 03:48:35,268 - INFO - [204800/569465] Loss : 0.5499 +2024-05-30 03:48:38,697 - INFO - [266240/569465] Loss : 0.5506 +2024-05-30 03:48:43,340 - INFO - Test loss : 0.6292 +2024-05-30 03:48:43,340 - INFO - +Epoch 85 +2024-05-30 03:48:55,371 - INFO - [204800/569465] Loss : 0.5490 +2024-05-30 03:48:58,786 - INFO - [266240/569465] Loss : 0.5499 +2024-05-30 03:49:03,480 - INFO - Test loss : 0.6374 +2024-05-30 03:49:03,480 - INFO - +Epoch 86 +2024-05-30 03:49:15,510 - INFO - [204800/569465] Loss : 0.5483 +2024-05-30 03:49:18,905 - INFO - [266240/569465] Loss : 0.5489 +2024-05-30 03:49:23,536 - INFO - Test loss : 0.6251 +2024-05-30 03:49:23,536 - INFO - +Epoch 87 +2024-05-30 03:49:35,614 - INFO - [204800/569465] Loss : 0.5478 +2024-05-30 03:49:39,058 - INFO - [266240/569465] Loss : 0.5487 +2024-05-30 03:49:43,722 - INFO - Test loss : 0.6404 +2024-05-30 03:49:43,722 - INFO - +Epoch 88 +2024-05-30 03:49:55,808 - INFO - [204800/569465] Loss : 0.5474 +2024-05-30 03:49:59,250 - INFO - [266240/569465] Loss : 0.5487 +2024-05-30 03:50:03,910 - INFO - Test loss : 0.6217 +2024-05-30 03:50:03,910 - INFO - +Epoch 89 +2024-05-30 03:50:15,756 - INFO - [204800/569465] Loss : 0.5480 +2024-05-30 03:50:19,134 - INFO - [266240/569465] Loss : 0.5484 +2024-05-30 03:50:23,728 - INFO - Test loss : 0.6532 
+2024-05-30 03:50:23,729 - INFO - +Epoch 90 +2024-05-30 03:50:35,597 - INFO - [204800/569465] Loss : 0.5467 +2024-05-30 03:50:38,882 - INFO - [266240/569465] Loss : 0.5472 +2024-05-30 03:50:43,556 - INFO - Test loss : 0.6473 +2024-05-30 03:50:44,924 - INFO - (93622,) +2024-05-30 03:51:31,509 - INFO - Split ID: 0 +2024-05-30 03:51:31,546 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 0.79 +2024-05-30 03:51:31,550 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 2.14 +2024-05-30 03:51:31,553 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 3.32 +2024-05-30 03:51:31,556 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 5.81 +2024-05-30 03:51:31,816 - INFO - +No prior +2024-05-30 03:51:32,446 - INFO - (95986,) +2024-05-30 03:51:50,734 - INFO - Split ID: 0 +2024-05-30 03:51:50,734 - INFO - Top 1 (Epoch 90)acc (%): 63.27 +2024-05-30 03:51:50,735 - INFO - Top 3 (Epoch 90)acc (%): 79.82 +2024-05-30 03:51:50,735 - INFO - Top 5 (Epoch 90)acc (%): 84.51 +2024-05-30 03:51:50,735 - INFO - Top 10 (Epoch 90)acc (%): 88.99 +2024-05-30 03:52:37,952 - INFO - Split ID: 0 +2024-05-30 03:52:37,953 - INFO - Top 1 (Epoch 90)acc (%): 69.36 +2024-05-30 03:52:37,953 - INFO - Top 3 (Epoch 90)acc (%): 84.13 +2024-05-30 03:52:37,953 - INFO - Top 5 (Epoch 90)acc (%): 87.89 +2024-05-30 03:52:37,953 - INFO - Top 10 (Epoch 90)acc (%): 91.47 +2024-05-30 03:52:37,955 - INFO - +Epoch 91 +2024-05-30 03:52:50,016 - INFO - [204800/569465] Loss : 0.5473 +2024-05-30 03:52:53,451 - INFO - [266240/569465] Loss : 0.5480 +2024-05-30 03:52:58,117 - INFO - Test loss : 0.6493 +2024-05-30 03:52:58,118 - INFO - +Epoch 92 +2024-05-30 03:53:10,184 - INFO - [204800/569465] Loss : 0.5465 +2024-05-30 03:53:13,626 - INFO - [266240/569465] Loss : 0.5467 +2024-05-30 03:53:18,282 - INFO - Test loss : 0.6497 +2024-05-30 03:53:18,283 - INFO - +Epoch 93 +2024-05-30 03:53:30,338 - INFO - [204800/569465] Loss : 0.5464 +2024-05-30 03:53:33,747 - INFO - [266240/569465] Loss : 0.5468 +2024-05-30 03:53:38,430 - INFO - Test loss : 0.6395 +2024-05-30 03:53:38,430 - INFO - +Epoch 94 +2024-05-30 03:53:50,509 - INFO - [204800/569465] Loss : 0.5448 +2024-05-30 03:53:53,950 - INFO - [266240/569465] Loss : 0.5457 +2024-05-30 03:53:58,610 - INFO - Test loss : 0.6528 +2024-05-30 03:53:58,610 - INFO - +Epoch 95 +2024-05-30 03:54:10,700 - INFO - [204800/569465] Loss : 0.5452 +2024-05-30 03:54:14,025 - INFO - [266240/569465] Loss : 0.5461 +2024-05-30 03:54:18,791 - INFO - Test loss : 0.6392 +2024-05-30 03:54:18,791 - INFO - +Epoch 96 +2024-05-30 03:54:30,871 - INFO - [204800/569465] Loss : 0.5449 +2024-05-30 03:54:34,220 - INFO - [266240/569465] Loss : 0.5457 +2024-05-30 03:54:38,993 - INFO - Test loss : 0.6306 +2024-05-30 03:54:38,994 - INFO - +Epoch 97 +2024-05-30 03:54:51,003 - INFO - [204800/569465] Loss : 0.5437 +2024-05-30 03:54:54,301 - INFO - [266240/569465] Loss : 0.5449 +2024-05-30 03:54:59,023 - INFO - Test loss : 0.6461 +2024-05-30 03:54:59,023 - INFO - +Epoch 98 +2024-05-30 03:55:11,011 - INFO - [204800/569465] Loss : 0.5448 +2024-05-30 03:55:14,354 - INFO - [266240/569465] Loss : 0.5453 +2024-05-30 03:55:19,098 - INFO - Test loss : 0.6566 +2024-05-30 03:55:19,098 - INFO - +Epoch 99 +2024-05-30 03:55:31,165 - INFO - [204800/569465] Loss : 0.5447 +2024-05-30 03:55:34,516 - INFO - [266240/569465] Loss : 0.5454 +2024-05-30 03:55:39,179 - INFO - Test loss : 0.6480 +2024-05-30 03:55:39,179 - INFO - Saving output model to ../models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 03:55:39,230 - INFO - Saving 
output model to ../models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 03:55:39,423 - INFO - +No prior +2024-05-30 03:55:40,069 - INFO - (95986,) +2024-05-30 03:55:58,589 - INFO - Split ID: 0 +2024-05-30 03:55:58,589 - INFO - Top 1 acc (%): 63.27 +2024-05-30 03:55:58,590 - INFO - Top 3 acc (%): 79.82 +2024-05-30 03:55:58,590 - INFO - Top 5 acc (%): 84.51 +2024-05-30 03:55:58,590 - INFO - Top 10 acc (%): 88.99 +2024-05-30 03:56:46,228 - INFO - Split ID: 0 +2024-05-30 03:56:46,229 - INFO - Top 1 acc (%): 69.42 +2024-05-30 03:56:46,229 - INFO - Top 3 acc (%): 84.14 +2024-05-30 03:56:46,229 - INFO - Top 5 acc (%): 87.9 +2024-05-30 03:56:46,229 - INFO - Top 10 acc (%): 91.46 +2024-05-30 03:56:46,231 - INFO - +Sphere2Vec-dfs +2024-05-30 03:56:46,231 - INFO - Model : model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 03:56:47,829 - INFO - (93622,) +2024-05-30 03:57:34,757 - INFO - Split ID: 0 +2024-05-30 03:57:34,796 - INFO - Top 1 LocEnc acc (%): 0.8 +2024-05-30 03:57:34,799 - INFO - Top 3 LocEnc acc (%): 2.14 +2024-05-30 03:57:34,802 - INFO - Top 5 LocEnc acc (%): 3.34 +2024-05-30 03:57:34,806 - INFO - Top 10 LocEnc acc (%): 5.92 +2024-05-31 04:02:26,711 - INFO - +num_classes 5089 +2024-05-31 04:02:26,711 - INFO - num train 569465 +2024-05-31 04:02:26,711 - INFO - num val 93622 +2024-05-31 04:02:26,711 - INFO - train loss full_loss +2024-05-31 04:02:26,711 - INFO - model name ../models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 04:02:26,711 - INFO - num users 17302 +2024-05-31 04:02:27,578 - INFO - +Only Sphere2Vec-dfs +2024-05-31 04:02:27,579 - INFO - Model : model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 04:02:27,950 - INFO - Saving output model to ../models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 04:02:28,165 - INFO - Saving output model to ../models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 04:02:28,523 - INFO - +No prior +2024-05-31 04:02:29,267 - INFO - (95986,) +2024-05-31 04:03:00,789 - INFO - Save results to ../eval_results/eval_inat_2017__val_no_prior.csv +2024-05-31 04:03:00,789 - INFO - Split ID: 0 +2024-05-31 04:03:00,790 - INFO - Top 1 acc (%): 63.27 +2024-05-31 04:03:00,790 - INFO - Top 3 acc (%): 79.82 +2024-05-31 04:03:00,790 - INFO - Top 5 acc (%): 84.51 +2024-05-31 04:03:00,790 - INFO - Top 10 acc (%): 88.99 +2024-05-31 04:04:14,728 - INFO - Split ID: 0 +2024-05-31 04:04:14,729 - INFO - Top 1 hit (%): 69.42 +2024-05-31 04:04:14,729 - INFO - Top 3 hit (%): 84.14 +2024-05-31 04:04:14,729 - INFO - Top 5 hit (%): 87.9 +2024-05-31 04:04:14,729 - INFO - Top 10 hit (%): 91.46 +2024-05-31 04:04:14,754 - INFO - +Only Sphere2Vec-dfs +2024-05-31 04:04:14,754 - INFO - Model : model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 04:04:16,504 - INFO - (93622,) +2024-05-31 04:05:02,412 - INFO - Split ID: 0 +2024-05-31 04:05:02,444 - INFO - Top 1 LocEnc acc (%): 0.8 +2024-05-31 04:05:02,447 - INFO - Top 3 LocEnc acc (%): 2.14 +2024-05-31 04:05:02,450 - INFO - Top 5 LocEnc acc (%): 3.34 +2024-05-31 04:05:02,453 - INFO - Top 10 LocEnc acc (%): 5.92 diff --git 
a/pre_trained_models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..67195232 Binary files /dev/null and b/pre_trained_models/sphere2vec_dfs/model_inat_2017_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0100000_1.000_1_512_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/sphere2vec_dfs/model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.log b/pre_trained_models/sphere2vec_dfs/model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..e9d458ee --- /dev/null +++ b/pre_trained_models/sphere2vec_dfs/model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.log @@ -0,0 +1,532 @@ +2024-05-31 09:14:34,636 - INFO - +num_classes 8142 +2024-05-31 09:14:34,636 - INFO - num train 436063 +2024-05-31 09:14:34,636 - INFO - num val 24343 +2024-05-31 09:14:34,636 - INFO - train loss full_loss +2024-05-31 09:14:34,636 - INFO - model name ../models/sphere2vec_dfs/model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 09:14:34,636 - INFO - num users 18643 +2024-05-31 09:14:35,529 - INFO - +Epoch 0 +2024-05-31 09:14:48,763 - INFO - [204800/436063] Loss : 1.3299 +2024-05-31 09:14:52,047 - INFO - [262144/436063] Loss : 1.2484 +2024-05-31 09:14:53,181 - INFO - Test loss : 0.3312 +2024-05-31 09:14:53,181 - INFO - +Epoch 1 +2024-05-31 09:15:05,927 - INFO - [204800/436063] Loss : 0.7958 +2024-05-31 09:15:09,233 - INFO - [262144/436063] Loss : 0.7831 +2024-05-31 09:15:10,458 - INFO - Test loss : 0.2415 +2024-05-31 09:15:10,458 - INFO - +Epoch 2 +2024-05-31 09:15:23,130 - INFO - [204800/436063] Loss : 0.6801 +2024-05-31 09:15:26,432 - INFO - [262144/436063] Loss : 0.6746 +2024-05-31 09:15:27,655 - INFO - Test loss : 0.2033 +2024-05-31 09:15:27,655 - INFO - +Epoch 3 +2024-05-31 09:15:40,310 - INFO - [204800/436063] Loss : 0.6240 +2024-05-31 09:15:43,612 - INFO - [262144/436063] Loss : 0.6222 +2024-05-31 09:15:44,833 - INFO - Test loss : 0.2148 +2024-05-31 09:15:44,833 - INFO - +Epoch 4 +2024-05-31 09:15:57,474 - INFO - [204800/436063] Loss : 0.5910 +2024-05-31 09:16:00,766 - INFO - [262144/436063] Loss : 0.5894 +2024-05-31 09:16:01,987 - INFO - Test loss : 0.1833 +2024-05-31 09:16:01,987 - INFO - +Epoch 5 +2024-05-31 09:16:14,711 - INFO - [204800/436063] Loss : 0.5680 +2024-05-31 09:16:17,912 - INFO - [262144/436063] Loss : 0.5681 +2024-05-31 09:16:19,137 - INFO - Test loss : 0.1830 +2024-05-31 09:16:19,137 - INFO - +Epoch 6 +2024-05-31 09:16:31,864 - INFO - [204800/436063] Loss : 0.5548 +2024-05-31 09:16:35,087 - INFO - [262144/436063] Loss : 0.5548 +2024-05-31 09:16:36,308 - INFO - Test loss : 0.1955 +2024-05-31 09:16:36,309 - INFO - +Epoch 7 +2024-05-31 09:16:49,014 - INFO - [204800/436063] Loss : 0.5425 +2024-05-31 09:16:52,306 - INFO - [262144/436063] Loss : 0.5427 +2024-05-31 09:16:53,428 - INFO - Test loss : 0.1709 +2024-05-31 09:16:53,428 - INFO - +Epoch 8 +2024-05-31 09:17:06,138 - INFO - [204800/436063] Loss : 0.5323 +2024-05-31 09:17:09,421 - INFO - [262144/436063] Loss : 0.5320 +2024-05-31 09:17:10,638 - INFO - Test loss : 0.1889 +2024-05-31 09:17:10,638 - INFO - +Epoch 9 +2024-05-31 09:17:23,230 - INFO - [204800/436063] Loss : 0.5244 +2024-05-31 09:17:26,501 - INFO - [262144/436063] Loss : 
0.5242 +2024-05-31 09:17:27,714 - INFO - Test loss : 0.2226 +2024-05-31 09:17:27,714 - INFO - +Epoch 10 +2024-05-31 09:17:40,289 - INFO - [204800/436063] Loss : 0.5158 +2024-05-31 09:17:43,574 - INFO - [262144/436063] Loss : 0.5159 +2024-05-31 09:17:44,791 - INFO - Test loss : 0.1880 +2024-05-31 09:17:44,791 - INFO - +Epoch 11 +2024-05-31 09:17:57,579 - INFO - [204800/436063] Loss : 0.5092 +2024-05-31 09:18:00,785 - INFO - [262144/436063] Loss : 0.5097 +2024-05-31 09:18:02,010 - INFO - Test loss : 0.2064 +2024-05-31 09:18:02,010 - INFO - +Epoch 12 +2024-05-31 09:18:14,694 - INFO - [204800/436063] Loss : 0.5030 +2024-05-31 09:18:17,889 - INFO - [262144/436063] Loss : 0.5040 +2024-05-31 09:18:19,106 - INFO - Test loss : 0.1819 +2024-05-31 09:18:19,106 - INFO - +Epoch 13 +2024-05-31 09:18:31,775 - INFO - [204800/436063] Loss : 0.4984 +2024-05-31 09:18:35,045 - INFO - [262144/436063] Loss : 0.4994 +2024-05-31 09:18:36,170 - INFO - Test loss : 0.1960 +2024-05-31 09:18:36,170 - INFO - +Epoch 14 +2024-05-31 09:18:48,837 - INFO - [204800/436063] Loss : 0.4946 +2024-05-31 09:18:52,127 - INFO - [262144/436063] Loss : 0.4945 +2024-05-31 09:18:53,347 - INFO - Test loss : 0.1894 +2024-05-31 09:18:53,347 - INFO - +Epoch 15 +2024-05-31 09:19:05,924 - INFO - [204800/436063] Loss : 0.4898 +2024-05-31 09:19:09,198 - INFO - [262144/436063] Loss : 0.4907 +2024-05-31 09:19:10,407 - INFO - Test loss : 0.2054 +2024-05-31 09:19:10,407 - INFO - +Epoch 16 +2024-05-31 09:19:22,947 - INFO - [204800/436063] Loss : 0.4857 +2024-05-31 09:19:26,212 - INFO - [262144/436063] Loss : 0.4865 +2024-05-31 09:19:27,427 - INFO - Test loss : 0.1961 +2024-05-31 09:19:27,427 - INFO - +Epoch 17 +2024-05-31 09:19:40,056 - INFO - [204800/436063] Loss : 0.4813 +2024-05-31 09:19:43,238 - INFO - [262144/436063] Loss : 0.4819 +2024-05-31 09:19:44,452 - INFO - Test loss : 0.2037 +2024-05-31 09:19:44,453 - INFO - +Epoch 18 +2024-05-31 09:19:57,107 - INFO - [204800/436063] Loss : 0.4787 +2024-05-31 09:20:00,299 - INFO - [262144/436063] Loss : 0.4792 +2024-05-31 09:20:01,521 - INFO - Test loss : 0.2068 +2024-05-31 09:20:01,521 - INFO - +Epoch 19 +2024-05-31 09:20:14,197 - INFO - [204800/436063] Loss : 0.4751 +2024-05-31 09:20:17,483 - INFO - [262144/436063] Loss : 0.4757 +2024-05-31 09:20:18,619 - INFO - Test loss : 0.2080 +2024-05-31 09:20:18,620 - INFO - +Epoch 20 +2024-05-31 09:20:31,294 - INFO - [204800/436063] Loss : 0.4732 +2024-05-31 09:20:34,580 - INFO - [262144/436063] Loss : 0.4739 +2024-05-31 09:20:35,801 - INFO - Test loss : 0.2184 +2024-05-31 09:20:35,801 - INFO - +Epoch 21 +2024-05-31 09:20:48,386 - INFO - [204800/436063] Loss : 0.4705 +2024-05-31 09:20:51,674 - INFO - [262144/436063] Loss : 0.4709 +2024-05-31 09:20:52,895 - INFO - Test loss : 0.2086 +2024-05-31 09:20:52,895 - INFO - +Epoch 22 +2024-05-31 09:21:05,467 - INFO - [204800/436063] Loss : 0.4682 +2024-05-31 09:21:08,738 - INFO - [262144/436063] Loss : 0.4687 +2024-05-31 09:21:09,954 - INFO - Test loss : 0.2051 +2024-05-31 09:21:09,955 - INFO - +Epoch 23 +2024-05-31 09:21:22,605 - INFO - [204800/436063] Loss : 0.4652 +2024-05-31 09:21:25,787 - INFO - [262144/436063] Loss : 0.4658 +2024-05-31 09:21:27,004 - INFO - Test loss : 0.2130 +2024-05-31 09:21:27,004 - INFO - +Epoch 24 +2024-05-31 09:21:39,554 - INFO - [204800/436063] Loss : 0.4637 +2024-05-31 09:21:42,703 - INFO - [262144/436063] Loss : 0.4644 +2024-05-31 09:21:43,917 - INFO - Test loss : 0.2249 +2024-05-31 09:21:43,918 - INFO - +Epoch 25 +2024-05-31 09:21:56,093 - INFO - [204800/436063] Loss : 0.4605 +2024-05-31 
09:21:59,259 - INFO - [262144/436063] Loss : 0.4616 +2024-05-31 09:22:00,391 - INFO - Test loss : 0.2240 +2024-05-31 09:22:00,391 - INFO - +Epoch 26 +2024-05-31 09:22:12,570 - INFO - [204800/436063] Loss : 0.4607 +2024-05-31 09:22:15,723 - INFO - [262144/436063] Loss : 0.4610 +2024-05-31 09:22:16,933 - INFO - Test loss : 0.2107 +2024-05-31 09:22:16,935 - INFO - +Epoch 27 +2024-05-31 09:22:29,364 - INFO - [204800/436063] Loss : 0.4583 +2024-05-31 09:22:32,607 - INFO - [262144/436063] Loss : 0.4585 +2024-05-31 09:22:33,820 - INFO - Test loss : 0.2191 +2024-05-31 09:22:33,820 - INFO - +Epoch 28 +2024-05-31 09:22:45,925 - INFO - [204800/436063] Loss : 0.4564 +2024-05-31 09:22:49,080 - INFO - [262144/436063] Loss : 0.4572 +2024-05-31 09:22:50,288 - INFO - Test loss : 0.2236 +2024-05-31 09:22:50,290 - INFO - +Epoch 29 +2024-05-31 09:23:02,960 - INFO - [204800/436063] Loss : 0.4521 +2024-05-31 09:23:06,147 - INFO - [262144/436063] Loss : 0.4540 +2024-05-31 09:23:07,362 - INFO - Test loss : 0.2113 +2024-05-31 09:23:07,363 - INFO - +Epoch 30 +2024-05-31 09:23:20,008 - INFO - [204800/436063] Loss : 0.4524 +2024-05-31 09:23:23,193 - INFO - [262144/436063] Loss : 0.4534 +2024-05-31 09:23:24,407 - INFO - Test loss : 0.2329 +2024-05-31 09:23:24,407 - INFO - +Epoch 31 +2024-05-31 09:23:36,927 - INFO - [204800/436063] Loss : 0.4504 +2024-05-31 09:23:40,171 - INFO - [262144/436063] Loss : 0.4511 +2024-05-31 09:23:41,292 - INFO - Test loss : 0.2004 +2024-05-31 09:23:41,293 - INFO - +Epoch 32 +2024-05-31 09:23:53,931 - INFO - [204800/436063] Loss : 0.4504 +2024-05-31 09:23:57,208 - INFO - [262144/436063] Loss : 0.4512 +2024-05-31 09:23:58,422 - INFO - Test loss : 0.2263 +2024-05-31 09:23:58,422 - INFO - +Epoch 33 +2024-05-31 09:24:11,006 - INFO - [204800/436063] Loss : 0.4477 +2024-05-31 09:24:14,277 - INFO - [262144/436063] Loss : 0.4487 +2024-05-31 09:24:15,490 - INFO - Test loss : 0.2102 +2024-05-31 09:24:15,490 - INFO - +Epoch 34 +2024-05-31 09:24:28,038 - INFO - [204800/436063] Loss : 0.4464 +2024-05-31 09:24:31,300 - INFO - [262144/436063] Loss : 0.4469 +2024-05-31 09:24:32,512 - INFO - Test loss : 0.2260 +2024-05-31 09:24:32,512 - INFO - +Epoch 35 +2024-05-31 09:24:45,186 - INFO - [204800/436063] Loss : 0.4453 +2024-05-31 09:24:48,372 - INFO - [262144/436063] Loss : 0.4459 +2024-05-31 09:24:49,585 - INFO - Test loss : 0.2378 +2024-05-31 09:24:49,585 - INFO - +Epoch 36 +2024-05-31 09:25:02,253 - INFO - [204800/436063] Loss : 0.4439 +2024-05-31 09:25:05,437 - INFO - [262144/436063] Loss : 0.4450 +2024-05-31 09:25:06,658 - INFO - Test loss : 0.2261 +2024-05-31 09:25:06,658 - INFO - +Epoch 37 +2024-05-31 09:25:19,334 - INFO - [204800/436063] Loss : 0.4426 +2024-05-31 09:25:22,616 - INFO - [262144/436063] Loss : 0.4439 +2024-05-31 09:25:23,749 - INFO - Test loss : 0.2437 +2024-05-31 09:25:23,749 - INFO - +Epoch 38 +2024-05-31 09:25:36,407 - INFO - [204800/436063] Loss : 0.4426 +2024-05-31 09:25:39,697 - INFO - [262144/436063] Loss : 0.4433 +2024-05-31 09:25:40,914 - INFO - Test loss : 0.2284 +2024-05-31 09:25:40,914 - INFO - +Epoch 39 +2024-05-31 09:25:53,487 - INFO - [204800/436063] Loss : 0.4407 +2024-05-31 09:25:56,771 - INFO - [262144/436063] Loss : 0.4417 +2024-05-31 09:25:57,988 - INFO - Test loss : 0.2296 +2024-05-31 09:25:57,988 - INFO - +Epoch 40 +2024-05-31 09:26:10,541 - INFO - [204800/436063] Loss : 0.4393 +2024-05-31 09:26:13,809 - INFO - [262144/436063] Loss : 0.4403 +2024-05-31 09:26:15,023 - INFO - Test loss : 0.2261 +2024-05-31 09:26:15,023 - INFO - +Epoch 41 +2024-05-31 09:26:27,533 - 
INFO - [204800/436063] Loss : 0.4381 +2024-05-31 09:26:30,676 - INFO - [262144/436063] Loss : 0.4396 +2024-05-31 09:26:31,897 - INFO - Test loss : 0.2159 +2024-05-31 09:26:31,897 - INFO - +Epoch 42 +2024-05-31 09:26:44,058 - INFO - [204800/436063] Loss : 0.4368 +2024-05-31 09:26:47,118 - INFO - [262144/436063] Loss : 0.4373 +2024-05-31 09:26:48,326 - INFO - Test loss : 0.2271 +2024-05-31 09:26:48,328 - INFO - +Epoch 43 +2024-05-31 09:27:00,948 - INFO - [204800/436063] Loss : 0.4364 +2024-05-31 09:27:04,220 - INFO - [262144/436063] Loss : 0.4372 +2024-05-31 09:27:05,343 - INFO - Test loss : 0.2188 +2024-05-31 09:27:05,343 - INFO - +Epoch 44 +2024-05-31 09:27:17,988 - INFO - [204800/436063] Loss : 0.4355 +2024-05-31 09:27:21,260 - INFO - [262144/436063] Loss : 0.4362 +2024-05-31 09:27:22,472 - INFO - Test loss : 0.2465 +2024-05-31 09:27:22,472 - INFO - +Epoch 45 +2024-05-31 09:27:35,001 - INFO - [204800/436063] Loss : 0.4344 +2024-05-31 09:27:38,295 - INFO - [262144/436063] Loss : 0.4351 +2024-05-31 09:27:39,514 - INFO - Test loss : 0.2318 +2024-05-31 09:27:39,514 - INFO - +Epoch 46 +2024-05-31 09:27:52,113 - INFO - [204800/436063] Loss : 0.4339 +2024-05-31 09:27:55,404 - INFO - [262144/436063] Loss : 0.4343 +2024-05-31 09:27:56,624 - INFO - Test loss : 0.2303 +2024-05-31 09:27:56,624 - INFO - +Epoch 47 +2024-05-31 09:28:09,350 - INFO - [204800/436063] Loss : 0.4316 +2024-05-31 09:28:12,538 - INFO - [262144/436063] Loss : 0.4327 +2024-05-31 09:28:13,758 - INFO - Test loss : 0.2363 +2024-05-31 09:28:13,758 - INFO - +Epoch 48 +2024-05-31 09:28:26,300 - INFO - [204800/436063] Loss : 0.4320 +2024-05-31 09:28:29,453 - INFO - [262144/436063] Loss : 0.4328 +2024-05-31 09:28:30,667 - INFO - Test loss : 0.2265 +2024-05-31 09:28:30,667 - INFO - +Epoch 49 +2024-05-31 09:28:43,345 - INFO - [204800/436063] Loss : 0.4299 +2024-05-31 09:28:46,649 - INFO - [262144/436063] Loss : 0.4312 +2024-05-31 09:28:47,777 - INFO - Test loss : 0.2346 +2024-05-31 09:28:47,777 - INFO - +Epoch 50 +2024-05-31 09:29:00,461 - INFO - [204800/436063] Loss : 0.4298 +2024-05-31 09:29:03,759 - INFO - [262144/436063] Loss : 0.4307 +2024-05-31 09:29:04,988 - INFO - Test loss : 0.2408 +2024-05-31 09:29:04,988 - INFO - +Epoch 51 +2024-05-31 09:29:17,572 - INFO - [204800/436063] Loss : 0.4300 +2024-05-31 09:29:20,850 - INFO - [262144/436063] Loss : 0.4303 +2024-05-31 09:29:22,066 - INFO - Test loss : 0.2417 +2024-05-31 09:29:22,066 - INFO - +Epoch 52 +2024-05-31 09:29:34,636 - INFO - [204800/436063] Loss : 0.4282 +2024-05-31 09:29:37,914 - INFO - [262144/436063] Loss : 0.4289 +2024-05-31 09:29:39,131 - INFO - Test loss : 0.2433 +2024-05-31 09:29:39,132 - INFO - +Epoch 53 +2024-05-31 09:29:51,877 - INFO - [204800/436063] Loss : 0.4267 +2024-05-31 09:29:55,069 - INFO - [262144/436063] Loss : 0.4276 +2024-05-31 09:29:56,287 - INFO - Test loss : 0.2440 +2024-05-31 09:29:56,287 - INFO - +Epoch 54 +2024-05-31 09:30:08,957 - INFO - [204800/436063] Loss : 0.4264 +2024-05-31 09:30:12,154 - INFO - [262144/436063] Loss : 0.4270 +2024-05-31 09:30:13,370 - INFO - Test loss : 0.2318 +2024-05-31 09:30:13,370 - INFO - +Epoch 55 +2024-05-31 09:30:25,911 - INFO - [204800/436063] Loss : 0.4257 +2024-05-31 09:30:29,159 - INFO - [262144/436063] Loss : 0.4263 +2024-05-31 09:30:30,300 - INFO - Test loss : 0.2256 +2024-05-31 09:30:30,300 - INFO - +Epoch 56 +2024-05-31 09:30:42,981 - INFO - [204800/436063] Loss : 0.4251 +2024-05-31 09:30:46,266 - INFO - [262144/436063] Loss : 0.4258 +2024-05-31 09:30:47,483 - INFO - Test loss : 0.2541 +2024-05-31 09:30:47,484 
- INFO - +Epoch 57 +2024-05-31 09:31:00,073 - INFO - [204800/436063] Loss : 0.4239 +2024-05-31 09:31:03,354 - INFO - [262144/436063] Loss : 0.4247 +2024-05-31 09:31:04,567 - INFO - Test loss : 0.2518 +2024-05-31 09:31:04,567 - INFO - +Epoch 58 +2024-05-31 09:31:17,100 - INFO - [204800/436063] Loss : 0.4235 +2024-05-31 09:31:20,366 - INFO - [262144/436063] Loss : 0.4243 +2024-05-31 09:31:21,579 - INFO - Test loss : 0.2481 +2024-05-31 09:31:21,579 - INFO - +Epoch 59 +2024-05-31 09:31:34,205 - INFO - [204800/436063] Loss : 0.4228 +2024-05-31 09:31:37,396 - INFO - [262144/436063] Loss : 0.4241 +2024-05-31 09:31:38,613 - INFO - Test loss : 0.2348 +2024-05-31 09:31:38,614 - INFO - +Epoch 60 +2024-05-31 09:31:51,075 - INFO - [204800/436063] Loss : 0.4217 +2024-05-31 09:31:54,214 - INFO - [262144/436063] Loss : 0.4225 +2024-05-31 09:31:55,425 - INFO - Test loss : 0.2372 +2024-05-31 09:31:55,425 - INFO - +Epoch 61 +2024-05-31 09:32:08,058 - INFO - [204800/436063] Loss : 0.4212 +2024-05-31 09:32:11,331 - INFO - [262144/436063] Loss : 0.4223 +2024-05-31 09:32:12,456 - INFO - Test loss : 0.2412 +2024-05-31 09:32:12,456 - INFO - +Epoch 62 +2024-05-31 09:32:25,130 - INFO - [204800/436063] Loss : 0.4209 +2024-05-31 09:32:28,403 - INFO - [262144/436063] Loss : 0.4214 +2024-05-31 09:32:29,619 - INFO - Test loss : 0.2361 +2024-05-31 09:32:29,619 - INFO - +Epoch 63 +2024-05-31 09:32:42,176 - INFO - [204800/436063] Loss : 0.4205 +2024-05-31 09:32:45,444 - INFO - [262144/436063] Loss : 0.4211 +2024-05-31 09:32:46,656 - INFO - Test loss : 0.2419 +2024-05-31 09:32:46,656 - INFO - +Epoch 64 +2024-05-31 09:32:59,088 - INFO - [204800/436063] Loss : 0.4197 +2024-05-31 09:33:02,327 - INFO - [262144/436063] Loss : 0.4204 +2024-05-31 09:33:03,539 - INFO - Test loss : 0.2643 +2024-05-31 09:33:03,539 - INFO - +Epoch 65 +2024-05-31 09:33:16,175 - INFO - [204800/436063] Loss : 0.4203 +2024-05-31 09:33:19,354 - INFO - [262144/436063] Loss : 0.4206 +2024-05-31 09:33:20,570 - INFO - Test loss : 0.2606 +2024-05-31 09:33:20,570 - INFO - +Epoch 66 +2024-05-31 09:33:33,129 - INFO - [204800/436063] Loss : 0.4187 +2024-05-31 09:33:36,286 - INFO - [262144/436063] Loss : 0.4195 +2024-05-31 09:33:37,503 - INFO - Test loss : 0.2427 +2024-05-31 09:33:37,503 - INFO - +Epoch 67 +2024-05-31 09:33:49,709 - INFO - [204800/436063] Loss : 0.4177 +2024-05-31 09:33:52,870 - INFO - [262144/436063] Loss : 0.4183 +2024-05-31 09:33:54,000 - INFO - Test loss : 0.2540 +2024-05-31 09:33:54,002 - INFO - +Epoch 68 +2024-05-31 09:34:06,692 - INFO - [204800/436063] Loss : 0.4181 +2024-05-31 09:34:09,973 - INFO - [262144/436063] Loss : 0.4187 +2024-05-31 09:34:11,194 - INFO - Test loss : 0.2419 +2024-05-31 09:34:11,194 - INFO - +Epoch 69 +2024-05-31 09:34:23,845 - INFO - [204800/436063] Loss : 0.4171 +2024-05-31 09:34:27,156 - INFO - [262144/436063] Loss : 0.4172 +2024-05-31 09:34:28,384 - INFO - Test loss : 0.2315 +2024-05-31 09:34:28,384 - INFO - +Epoch 70 +2024-05-31 09:34:41,017 - INFO - [204800/436063] Loss : 0.4164 +2024-05-31 09:34:44,291 - INFO - [262144/436063] Loss : 0.4174 +2024-05-31 09:34:45,507 - INFO - Test loss : 0.2533 +2024-05-31 09:34:45,508 - INFO - +Epoch 71 +2024-05-31 09:34:58,189 - INFO - [204800/436063] Loss : 0.4158 +2024-05-31 09:35:01,366 - INFO - [262144/436063] Loss : 0.4165 +2024-05-31 09:35:02,577 - INFO - Test loss : 0.2513 +2024-05-31 09:35:02,577 - INFO - +Epoch 72 +2024-05-31 09:35:15,184 - INFO - [204800/436063] Loss : 0.4140 +2024-05-31 09:35:18,352 - INFO - [262144/436063] Loss : 0.4151 +2024-05-31 09:35:19,558 - INFO 
- Test loss : 0.2498 +2024-05-31 09:35:19,558 - INFO - +Epoch 73 +2024-05-31 09:35:32,078 - INFO - [204800/436063] Loss : 0.4144 +2024-05-31 09:35:35,318 - INFO - [262144/436063] Loss : 0.4148 +2024-05-31 09:35:36,441 - INFO - Test loss : 0.2375 +2024-05-31 09:35:36,441 - INFO - +Epoch 74 +2024-05-31 09:35:48,611 - INFO - [204800/436063] Loss : 0.4140 +2024-05-31 09:35:51,756 - INFO - [262144/436063] Loss : 0.4146 +2024-05-31 09:35:52,967 - INFO - Test loss : 0.2420 +2024-05-31 09:35:52,967 - INFO - +Epoch 75 +2024-05-31 09:36:05,116 - INFO - [204800/436063] Loss : 0.4128 +2024-05-31 09:36:08,262 - INFO - [262144/436063] Loss : 0.4138 +2024-05-31 09:36:09,492 - INFO - Test loss : 0.2455 +2024-05-31 09:36:09,495 - INFO - +Epoch 76 +2024-05-31 09:36:22,027 - INFO - [204800/436063] Loss : 0.4136 +2024-05-31 09:36:25,289 - INFO - [262144/436063] Loss : 0.4143 +2024-05-31 09:36:26,496 - INFO - Test loss : 0.2475 +2024-05-31 09:36:26,497 - INFO - +Epoch 77 +2024-05-31 09:36:39,121 - INFO - [204800/436063] Loss : 0.4126 +2024-05-31 09:36:42,299 - INFO - [262144/436063] Loss : 0.4134 +2024-05-31 09:36:43,514 - INFO - Test loss : 0.2500 +2024-05-31 09:36:43,514 - INFO - +Epoch 78 +2024-05-31 09:36:56,158 - INFO - [204800/436063] Loss : 0.4122 +2024-05-31 09:36:59,337 - INFO - [262144/436063] Loss : 0.4127 +2024-05-31 09:37:00,556 - INFO - Test loss : 0.2480 +2024-05-31 09:37:00,556 - INFO - +Epoch 79 +2024-05-31 09:37:13,180 - INFO - [204800/436063] Loss : 0.4122 +2024-05-31 09:37:16,451 - INFO - [262144/436063] Loss : 0.4129 +2024-05-31 09:37:17,582 - INFO - Test loss : 0.2538 +2024-05-31 09:37:17,583 - INFO - +Epoch 80 +2024-05-31 09:37:30,213 - INFO - [204800/436063] Loss : 0.4112 +2024-05-31 09:37:33,487 - INFO - [262144/436063] Loss : 0.4111 +2024-05-31 09:37:34,702 - INFO - Test loss : 0.2594 +2024-05-31 09:37:34,702 - INFO - +Epoch 81 +2024-05-31 09:37:47,261 - INFO - [204800/436063] Loss : 0.4100 +2024-05-31 09:37:50,537 - INFO - [262144/436063] Loss : 0.4112 +2024-05-31 09:37:51,754 - INFO - Test loss : 0.2575 +2024-05-31 09:37:51,754 - INFO - +Epoch 82 +2024-05-31 09:38:04,303 - INFO - [204800/436063] Loss : 0.4114 +2024-05-31 09:38:07,570 - INFO - [262144/436063] Loss : 0.4113 +2024-05-31 09:38:08,783 - INFO - Test loss : 0.2491 +2024-05-31 09:38:08,783 - INFO - +Epoch 83 +2024-05-31 09:38:21,405 - INFO - [204800/436063] Loss : 0.4107 +2024-05-31 09:38:24,591 - INFO - [262144/436063] Loss : 0.4110 +2024-05-31 09:38:25,812 - INFO - Test loss : 0.2557 +2024-05-31 09:38:25,812 - INFO - +Epoch 84 +2024-05-31 09:38:38,498 - INFO - [204800/436063] Loss : 0.4092 +2024-05-31 09:38:41,679 - INFO - [262144/436063] Loss : 0.4100 +2024-05-31 09:38:42,893 - INFO - Test loss : 0.2505 +2024-05-31 09:38:42,893 - INFO - +Epoch 85 +2024-05-31 09:38:55,600 - INFO - [204800/436063] Loss : 0.4091 +2024-05-31 09:38:58,901 - INFO - [262144/436063] Loss : 0.4095 +2024-05-31 09:39:00,031 - INFO - Test loss : 0.2580 +2024-05-31 09:39:00,032 - INFO - +Epoch 86 +2024-05-31 09:39:12,695 - INFO - [204800/436063] Loss : 0.4086 +2024-05-31 09:39:15,974 - INFO - [262144/436063] Loss : 0.4091 +2024-05-31 09:39:17,192 - INFO - Test loss : 0.2541 +2024-05-31 09:39:17,192 - INFO - +Epoch 87 +2024-05-31 09:39:29,714 - INFO - [204800/436063] Loss : 0.4084 +2024-05-31 09:39:32,974 - INFO - [262144/436063] Loss : 0.4090 +2024-05-31 09:39:34,185 - INFO - Test loss : 0.2608 +2024-05-31 09:39:34,185 - INFO - +Epoch 88 +2024-05-31 09:39:46,661 - INFO - [204800/436063] Loss : 0.4073 +2024-05-31 09:39:49,932 - INFO - [262144/436063] 
Loss : 0.4078 +2024-05-31 09:39:51,144 - INFO - Test loss : 0.2577 +2024-05-31 09:39:51,144 - INFO - +Epoch 89 +2024-05-31 09:40:03,862 - INFO - [204800/436063] Loss : 0.4072 +2024-05-31 09:40:07,041 - INFO - [262144/436063] Loss : 0.4079 +2024-05-31 09:40:08,257 - INFO - Test loss : 0.2528 +2024-05-31 09:40:08,257 - INFO - +Epoch 90 +2024-05-31 09:40:20,734 - INFO - [204800/436063] Loss : 0.4067 +2024-05-31 09:40:23,862 - INFO - [262144/436063] Loss : 0.4073 +2024-05-31 09:40:25,078 - INFO - Test loss : 0.2548 +2024-05-31 09:40:25,078 - INFO - +Epoch 91 +2024-05-31 09:40:37,274 - INFO - [204800/436063] Loss : 0.4062 +2024-05-31 09:40:40,440 - INFO - [262144/436063] Loss : 0.4070 +2024-05-31 09:40:41,566 - INFO - Test loss : 0.2484 +2024-05-31 09:40:41,568 - INFO - +Epoch 92 +2024-05-31 09:40:54,179 - INFO - [204800/436063] Loss : 0.4064 +2024-05-31 09:40:57,439 - INFO - [262144/436063] Loss : 0.4065 +2024-05-31 09:40:58,666 - INFO - Test loss : 0.2587 +2024-05-31 09:40:58,666 - INFO - +Epoch 93 +2024-05-31 09:41:11,433 - INFO - [204800/436063] Loss : 0.4057 +2024-05-31 09:41:14,739 - INFO - [262144/436063] Loss : 0.4062 +2024-05-31 09:41:15,958 - INFO - Test loss : 0.2732 +2024-05-31 09:41:15,958 - INFO - +Epoch 94 +2024-05-31 09:41:28,516 - INFO - [204800/436063] Loss : 0.4059 +2024-05-31 09:41:31,784 - INFO - [262144/436063] Loss : 0.4063 +2024-05-31 09:41:32,998 - INFO - Test loss : 0.2542 +2024-05-31 09:41:32,998 - INFO - +Epoch 95 +2024-05-31 09:41:45,661 - INFO - [204800/436063] Loss : 0.4048 +2024-05-31 09:41:48,855 - INFO - [262144/436063] Loss : 0.4055 +2024-05-31 09:41:50,077 - INFO - Test loss : 0.2680 +2024-05-31 09:41:50,077 - INFO - +Epoch 96 +2024-05-31 09:42:02,887 - INFO - [204800/436063] Loss : 0.4043 +2024-05-31 09:42:06,078 - INFO - [262144/436063] Loss : 0.4048 +2024-05-31 09:42:07,294 - INFO - Test loss : 0.2635 +2024-05-31 09:42:07,294 - INFO - +Epoch 97 +2024-05-31 09:42:20,034 - INFO - [204800/436063] Loss : 0.4045 +2024-05-31 09:42:23,305 - INFO - [262144/436063] Loss : 0.4049 +2024-05-31 09:42:24,440 - INFO - Test loss : 0.2586 +2024-05-31 09:42:24,440 - INFO - +Epoch 98 +2024-05-31 09:42:37,145 - INFO - [204800/436063] Loss : 0.4046 +2024-05-31 09:42:40,415 - INFO - [262144/436063] Loss : 0.4051 +2024-05-31 09:42:41,655 - INFO - Test loss : 0.2589 +2024-05-31 09:42:41,655 - INFO - +Epoch 99 +2024-05-31 09:42:54,538 - INFO - [204800/436063] Loss : 0.4044 +2024-05-31 09:42:57,900 - INFO - [262144/436063] Loss : 0.4043 +2024-05-31 09:42:59,164 - INFO - Test loss : 0.2659 +2024-05-31 09:42:59,164 - INFO - Saving output model to ../models/sphere2vec_dfs/model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 09:42:59,233 - INFO - Saving output model to ../models/sphere2vec_dfs/model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 09:42:59,499 - INFO - +No prior +2024-05-31 09:42:59,758 - INFO - (24426,) +2024-05-31 09:43:17,055 - INFO - Save results to ../eval_results/eval_inat_2018__val_no_prior.csv +2024-05-31 09:43:17,055 - INFO - Split ID: 0 +2024-05-31 09:43:17,056 - INFO - Top 1 acc (%): 60.2 +2024-05-31 09:43:17,056 - INFO - Top 3 acc (%): 77.9 +2024-05-31 09:43:17,056 - INFO - Top 5 acc (%): 83.29 +2024-05-31 09:43:17,056 - INFO - Top 10 acc (%): 88.5 +2024-05-31 09:43:40,504 - INFO - Split ID: 0 +2024-05-31 09:43:40,504 - INFO - Top 1 hit (%): 72.19 +2024-05-31 09:43:40,504 - INFO - Top 3 hit (%): 86.88 +2024-05-31 09:43:40,504 - INFO - Top 5 hit (%): 90.25 
+2024-05-31 09:43:40,505 - INFO - Top 10 hit (%): 93.49 +2024-05-31 09:43:40,510 - INFO - +Only Sphere2Vec-dfs +2024-05-31 09:43:40,510 - INFO - Model : model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.pth.tar +2024-05-31 09:43:41,050 - INFO - (24343,) +2024-05-31 09:44:01,100 - INFO - Split ID: 0 +2024-05-31 09:44:01,109 - INFO - Top 1 LocEnc acc (%): 0.69 +2024-05-31 09:44:01,110 - INFO - Top 3 LocEnc acc (%): 1.86 +2024-05-31 09:44:01,110 - INFO - Top 5 LocEnc acc (%): 2.92 +2024-05-31 09:44:01,111 - INFO - Top 10 LocEnc acc (%): 5.41 diff --git a/pre_trained_models/sphere2vec_dfs/model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/sphere2vec_dfs/model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..3b75bd07 Binary files /dev/null and b/pre_trained_models/sphere2vec_dfs/model_inat_2018_Sphere2Vec-dfs_0.0100_8_0.0001000_1.000_1_512_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.log b/pre_trained_models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..e64b1947 --- /dev/null +++ b/pre_trained_models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.log @@ -0,0 +1,639 @@ +2024-05-30 08:56:50,801 - INFO - +num_classes 555 +2024-05-30 08:56:50,801 - INFO - num train 22599 +2024-05-30 08:56:50,801 - INFO - num val 1100 +2024-05-30 08:56:50,801 - INFO - train loss full_loss +2024-05-30 08:56:50,801 - INFO - model name ../models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar +2024-05-30 08:56:50,801 - INFO - num users 5331 +2024-05-30 08:56:50,801 - INFO - meta data ebird_meta +2024-05-30 08:56:51,537 - INFO - +Epoch 0 +2024-05-30 08:56:53,331 - INFO - [20480/22599] Loss : 1.7545 +2024-05-30 08:56:53,383 - INFO - Test loss : 1.4960 +2024-05-30 08:56:53,383 - INFO - +Epoch 1 +2024-05-30 08:56:54,693 - INFO - [20480/22599] Loss : 1.4702 +2024-05-30 08:56:54,749 - INFO - Test loss : 0.6499 +2024-05-30 08:56:54,749 - INFO - +Epoch 2 +2024-05-30 08:56:56,003 - INFO - [20480/22599] Loss : 1.3455 +2024-05-30 08:56:56,054 - INFO - Test loss : 0.6314 +2024-05-30 08:56:56,054 - INFO - +Epoch 3 +2024-05-30 08:56:57,319 - INFO - [20480/22599] Loss : 1.2688 +2024-05-30 08:56:57,371 - INFO - Test loss : 0.4645 +2024-05-30 08:56:57,371 - INFO - +Epoch 4 +2024-05-30 08:56:58,637 - INFO - [20480/22599] Loss : 1.2234 +2024-05-30 08:56:58,689 - INFO - Test loss : 0.5243 +2024-05-30 08:56:58,689 - INFO - +Epoch 5 +2024-05-30 08:56:59,948 - INFO - [20480/22599] Loss : 1.1980 +2024-05-30 08:57:00,001 - INFO - Test loss : 0.4970 +2024-05-30 08:57:00,001 - INFO - +Epoch 6 +2024-05-30 08:57:01,264 - INFO - [20480/22599] Loss : 1.1805 +2024-05-30 08:57:01,315 - INFO - Test loss : 0.5153 +2024-05-30 08:57:01,315 - INFO - +Epoch 7 +2024-05-30 08:57:02,574 - INFO - [20480/22599] Loss : 1.1642 +2024-05-30 08:57:02,625 - INFO - Test loss : 0.4805 +2024-05-30 08:57:02,625 - INFO - +Epoch 8 +2024-05-30 08:57:03,887 - INFO - [20480/22599] Loss : 1.1535 +2024-05-30 08:57:03,938 - INFO - Test loss : 0.4929 +2024-05-30 08:57:03,938 - INFO - +Epoch 9 +2024-05-30 08:57:05,213 - 
INFO - [20480/22599] Loss : 1.1426 +2024-05-30 08:57:05,265 - INFO - Test loss : 0.5120 +2024-05-30 08:57:05,265 - INFO - +Epoch 10 +2024-05-30 08:57:06,534 - INFO - [20480/22599] Loss : 1.1305 +2024-05-30 08:57:06,585 - INFO - Test loss : 0.4948 +2024-05-30 08:57:06,598 - INFO - (1100,) +2024-05-30 08:57:06,643 - INFO - Split ID: 0 +2024-05-30 08:57:06,643 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 0.45 +2024-05-30 08:57:06,643 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 1.64 +2024-05-30 08:57:06,643 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 2.73 +2024-05-30 08:57:06,643 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 6.0 +2024-05-30 08:57:06,644 - INFO - +No prior +2024-05-30 08:57:06,678 - INFO - (24633,) +2024-05-30 08:57:07,342 - INFO - Split ID: 0 +2024-05-30 08:57:07,342 - INFO - Top 1 (Epoch 10)acc (%): 76.08 +2024-05-30 08:57:07,343 - INFO - Top 3 (Epoch 10)acc (%): 90.98 +2024-05-30 08:57:07,343 - INFO - Top 5 (Epoch 10)acc (%): 94.06 +2024-05-30 08:57:07,343 - INFO - Top 10 (Epoch 10)acc (%): 96.83 +2024-05-30 08:57:22,318 - INFO - Split ID: 0 +2024-05-30 08:57:22,318 - INFO - Top 1 (Epoch 10)acc (%): 79.08 +2024-05-30 08:57:22,318 - INFO - Top 3 (Epoch 10)acc (%): 92.33 +2024-05-30 08:57:22,318 - INFO - Top 5 (Epoch 10)acc (%): 95.01 +2024-05-30 08:57:22,318 - INFO - Top 10 (Epoch 10)acc (%): 97.43 +2024-05-30 08:57:22,318 - INFO - +Epoch 11 +2024-05-30 08:57:23,536 - INFO - [20480/22599] Loss : 1.1196 +2024-05-30 08:57:23,588 - INFO - Test loss : 0.4889 +2024-05-30 08:57:23,588 - INFO - +Epoch 12 +2024-05-30 08:57:24,807 - INFO - [20480/22599] Loss : 1.1103 +2024-05-30 08:57:24,869 - INFO - Test loss : 0.4945 +2024-05-30 08:57:24,869 - INFO - +Epoch 13 +2024-05-30 08:57:26,121 - INFO - [20480/22599] Loss : 1.0991 +2024-05-30 08:57:26,174 - INFO - Test loss : 0.5123 +2024-05-30 08:57:26,174 - INFO - +Epoch 14 +2024-05-30 08:57:27,416 - INFO - [20480/22599] Loss : 1.0931 +2024-05-30 08:57:27,469 - INFO - Test loss : 0.4936 +2024-05-30 08:57:27,469 - INFO - +Epoch 15 +2024-05-30 08:57:28,734 - INFO - [20480/22599] Loss : 1.0796 +2024-05-30 08:57:28,785 - INFO - Test loss : 0.4933 +2024-05-30 08:57:28,785 - INFO - +Epoch 16 +2024-05-30 08:57:30,045 - INFO - [20480/22599] Loss : 1.0723 +2024-05-30 08:57:30,097 - INFO - Test loss : 0.4853 +2024-05-30 08:57:30,097 - INFO - +Epoch 17 +2024-05-30 08:57:31,361 - INFO - [20480/22599] Loss : 1.0643 +2024-05-30 08:57:31,412 - INFO - Test loss : 0.4946 +2024-05-30 08:57:31,412 - INFO - +Epoch 18 +2024-05-30 08:57:32,758 - INFO - [20480/22599] Loss : 1.0555 +2024-05-30 08:57:32,809 - INFO - Test loss : 0.4707 +2024-05-30 08:57:32,809 - INFO - +Epoch 19 +2024-05-30 08:57:34,078 - INFO - [20480/22599] Loss : 1.0467 +2024-05-30 08:57:34,130 - INFO - Test loss : 0.4953 +2024-05-30 08:57:34,130 - INFO - +Epoch 20 +2024-05-30 08:57:35,398 - INFO - [20480/22599] Loss : 1.0403 +2024-05-30 08:57:35,449 - INFO - Test loss : 0.4894 +2024-05-30 08:57:35,456 - INFO - (1100,) +2024-05-30 08:57:35,501 - INFO - Split ID: 0 +2024-05-30 08:57:35,501 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 1.0 +2024-05-30 08:57:35,501 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 3.18 +2024-05-30 08:57:35,501 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 5.36 +2024-05-30 08:57:35,501 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 9.0 +2024-05-30 08:57:35,502 - INFO - +No prior +2024-05-30 08:57:35,530 - INFO - (24633,) +2024-05-30 08:57:36,188 - INFO - Split ID: 0 +2024-05-30 08:57:36,189 - INFO - Top 1 (Epoch 20)acc (%): 76.08 +2024-05-30 08:57:36,189 - INFO - Top 3 (Epoch 20)acc (%): 90.98 
+2024-05-30 08:57:36,189 - INFO - Top 5 (Epoch 20)acc (%): 94.06 +2024-05-30 08:57:36,189 - INFO - Top 10 (Epoch 20)acc (%): 96.83 +2024-05-30 08:57:51,235 - INFO - Split ID: 0 +2024-05-30 08:57:51,235 - INFO - Top 1 (Epoch 20)acc (%): 79.76 +2024-05-30 08:57:51,236 - INFO - Top 3 (Epoch 20)acc (%): 92.77 +2024-05-30 08:57:51,236 - INFO - Top 5 (Epoch 20)acc (%): 95.39 +2024-05-30 08:57:51,236 - INFO - Top 10 (Epoch 20)acc (%): 97.63 +2024-05-30 08:57:51,236 - INFO - +Epoch 21 +2024-05-30 08:57:52,505 - INFO - [20480/22599] Loss : 1.0331 +2024-05-30 08:57:52,558 - INFO - Test loss : 0.5017 +2024-05-30 08:57:52,558 - INFO - +Epoch 22 +2024-05-30 08:57:53,844 - INFO - [20480/22599] Loss : 1.0286 +2024-05-30 08:57:53,897 - INFO - Test loss : 0.4464 +2024-05-30 08:57:53,897 - INFO - +Epoch 23 +2024-05-30 08:57:55,169 - INFO - [20480/22599] Loss : 1.0185 +2024-05-30 08:57:55,221 - INFO - Test loss : 0.4766 +2024-05-30 08:57:55,221 - INFO - +Epoch 24 +2024-05-30 08:57:56,483 - INFO - [20480/22599] Loss : 1.0121 +2024-05-30 08:57:56,535 - INFO - Test loss : 0.4795 +2024-05-30 08:57:56,535 - INFO - +Epoch 25 +2024-05-30 08:57:57,804 - INFO - [20480/22599] Loss : 1.0057 +2024-05-30 08:57:57,856 - INFO - Test loss : 0.4461 +2024-05-30 08:57:57,856 - INFO - +Epoch 26 +2024-05-30 08:57:59,127 - INFO - [20480/22599] Loss : 1.0024 +2024-05-30 08:57:59,179 - INFO - Test loss : 0.4372 +2024-05-30 08:57:59,179 - INFO - +Epoch 27 +2024-05-30 08:58:00,444 - INFO - [20480/22599] Loss : 0.9942 +2024-05-30 08:58:00,498 - INFO - Test loss : 0.4779 +2024-05-30 08:58:00,498 - INFO - +Epoch 28 +2024-05-30 08:58:01,771 - INFO - [20480/22599] Loss : 0.9903 +2024-05-30 08:58:01,823 - INFO - Test loss : 0.4778 +2024-05-30 08:58:01,823 - INFO - +Epoch 29 +2024-05-30 08:58:03,086 - INFO - [20480/22599] Loss : 0.9815 +2024-05-30 08:58:03,137 - INFO - Test loss : 0.4666 +2024-05-30 08:58:03,138 - INFO - +Epoch 30 +2024-05-30 08:58:04,399 - INFO - [20480/22599] Loss : 0.9732 +2024-05-30 08:58:04,450 - INFO - Test loss : 0.4605 +2024-05-30 08:58:04,455 - INFO - (1100,) +2024-05-30 08:58:04,500 - INFO - Split ID: 0 +2024-05-30 08:58:04,500 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 1.45 +2024-05-30 08:58:04,500 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 3.73 +2024-05-30 08:58:04,500 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 6.82 +2024-05-30 08:58:04,501 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 10.82 +2024-05-30 08:58:04,501 - INFO - +No prior +2024-05-30 08:58:04,529 - INFO - (24633,) +2024-05-30 08:58:05,183 - INFO - Split ID: 0 +2024-05-30 08:58:05,183 - INFO - Top 1 (Epoch 30)acc (%): 76.08 +2024-05-30 08:58:05,183 - INFO - Top 3 (Epoch 30)acc (%): 90.98 +2024-05-30 08:58:05,183 - INFO - Top 5 (Epoch 30)acc (%): 94.06 +2024-05-30 08:58:05,184 - INFO - Top 10 (Epoch 30)acc (%): 96.83 +2024-05-30 08:58:20,119 - INFO - Split ID: 0 +2024-05-30 08:58:20,119 - INFO - Top 1 (Epoch 30)acc (%): 80.55 +2024-05-30 08:58:20,119 - INFO - Top 3 (Epoch 30)acc (%): 93.05 +2024-05-30 08:58:20,119 - INFO - Top 5 (Epoch 30)acc (%): 95.58 +2024-05-30 08:58:20,119 - INFO - Top 10 (Epoch 30)acc (%): 97.76 +2024-05-30 08:58:20,120 - INFO - +Epoch 31 +2024-05-30 08:58:21,368 - INFO - [20480/22599] Loss : 0.9721 +2024-05-30 08:58:21,420 - INFO - Test loss : 0.4696 +2024-05-30 08:58:21,420 - INFO - +Epoch 32 +2024-05-30 08:58:22,784 - INFO - [20480/22599] Loss : 0.9678 +2024-05-30 08:58:22,836 - INFO - Test loss : 0.4649 +2024-05-30 08:58:22,836 - INFO - +Epoch 33 +2024-05-30 08:58:24,101 - INFO - [20480/22599] Loss : 0.9654 +2024-05-30 08:58:24,152 - 
INFO - Test loss : 0.4548 +2024-05-30 08:58:24,152 - INFO - +Epoch 34 +2024-05-30 08:58:25,409 - INFO - [20480/22599] Loss : 0.9627 +2024-05-30 08:58:25,461 - INFO - Test loss : 0.4683 +2024-05-30 08:58:25,461 - INFO - +Epoch 35 +2024-05-30 08:58:26,726 - INFO - [20480/22599] Loss : 0.9580 +2024-05-30 08:58:26,778 - INFO - Test loss : 0.4741 +2024-05-30 08:58:26,778 - INFO - +Epoch 36 +2024-05-30 08:58:28,037 - INFO - [20480/22599] Loss : 0.9549 +2024-05-30 08:58:28,089 - INFO - Test loss : 0.4737 +2024-05-30 08:58:28,089 - INFO - +Epoch 37 +2024-05-30 08:58:29,347 - INFO - [20480/22599] Loss : 0.9504 +2024-05-30 08:58:29,398 - INFO - Test loss : 0.4620 +2024-05-30 08:58:29,398 - INFO - +Epoch 38 +2024-05-30 08:58:30,659 - INFO - [20480/22599] Loss : 0.9511 +2024-05-30 08:58:30,710 - INFO - Test loss : 0.4570 +2024-05-30 08:58:30,710 - INFO - +Epoch 39 +2024-05-30 08:58:31,973 - INFO - [20480/22599] Loss : 0.9459 +2024-05-30 08:58:32,024 - INFO - Test loss : 0.4587 +2024-05-30 08:58:32,025 - INFO - +Epoch 40 +2024-05-30 08:58:33,295 - INFO - [20480/22599] Loss : 0.9441 +2024-05-30 08:58:33,346 - INFO - Test loss : 0.4572 +2024-05-30 08:58:33,352 - INFO - (1100,) +2024-05-30 08:58:33,397 - INFO - Split ID: 0 +2024-05-30 08:58:33,398 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 1.45 +2024-05-30 08:58:33,398 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 4.27 +2024-05-30 08:58:33,398 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 6.55 +2024-05-30 08:58:33,398 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 11.18 +2024-05-30 08:58:33,398 - INFO - +No prior +2024-05-30 08:58:33,427 - INFO - (24633,) +2024-05-30 08:58:34,079 - INFO - Split ID: 0 +2024-05-30 08:58:34,080 - INFO - Top 1 (Epoch 40)acc (%): 76.08 +2024-05-30 08:58:34,080 - INFO - Top 3 (Epoch 40)acc (%): 90.98 +2024-05-30 08:58:34,080 - INFO - Top 5 (Epoch 40)acc (%): 94.06 +2024-05-30 08:58:34,080 - INFO - Top 10 (Epoch 40)acc (%): 96.83 +2024-05-30 08:58:49,034 - INFO - Split ID: 0 +2024-05-30 08:58:49,035 - INFO - Top 1 (Epoch 40)acc (%): 80.9 +2024-05-30 08:58:49,035 - INFO - Top 3 (Epoch 40)acc (%): 93.23 +2024-05-30 08:58:49,035 - INFO - Top 5 (Epoch 40)acc (%): 95.72 +2024-05-30 08:58:49,035 - INFO - Top 10 (Epoch 40)acc (%): 97.78 +2024-05-30 08:58:49,035 - INFO - +Epoch 41 +2024-05-30 08:58:50,302 - INFO - [20480/22599] Loss : 0.9409 +2024-05-30 08:58:50,355 - INFO - Test loss : 0.4628 +2024-05-30 08:58:50,355 - INFO - +Epoch 42 +2024-05-30 08:58:51,621 - INFO - [20480/22599] Loss : 0.9409 +2024-05-30 08:58:51,673 - INFO - Test loss : 0.4641 +2024-05-30 08:58:51,673 - INFO - +Epoch 43 +2024-05-30 08:58:52,936 - INFO - [20480/22599] Loss : 0.9377 +2024-05-30 08:58:52,988 - INFO - Test loss : 0.4417 +2024-05-30 08:58:52,988 - INFO - +Epoch 44 +2024-05-30 08:58:54,254 - INFO - [20480/22599] Loss : 0.9377 +2024-05-30 08:58:54,305 - INFO - Test loss : 0.4483 +2024-05-30 08:58:54,306 - INFO - +Epoch 45 +2024-05-30 08:58:55,569 - INFO - [20480/22599] Loss : 0.9364 +2024-05-30 08:58:55,621 - INFO - Test loss : 0.4545 +2024-05-30 08:58:55,621 - INFO - +Epoch 46 +2024-05-30 08:58:56,959 - INFO - [20480/22599] Loss : 0.9336 +2024-05-30 08:58:57,013 - INFO - Test loss : 0.4740 +2024-05-30 08:58:57,013 - INFO - +Epoch 47 +2024-05-30 08:58:58,295 - INFO - [20480/22599] Loss : 0.9307 +2024-05-30 08:58:58,346 - INFO - Test loss : 0.4764 +2024-05-30 08:58:58,346 - INFO - +Epoch 48 +2024-05-30 08:58:59,615 - INFO - [20480/22599] Loss : 0.9285 +2024-05-30 08:58:59,667 - INFO - Test loss : 0.4557 +2024-05-30 08:58:59,667 - INFO - +Epoch 49 +2024-05-30 08:59:00,943 
- INFO - [20480/22599] Loss : 0.9294 +2024-05-30 08:59:00,996 - INFO - Test loss : 0.4599 +2024-05-30 08:59:00,997 - INFO - +Epoch 50 +2024-05-30 08:59:02,263 - INFO - [20480/22599] Loss : 0.9288 +2024-05-30 08:59:02,315 - INFO - Test loss : 0.4608 +2024-05-30 08:59:02,320 - INFO - (1100,) +2024-05-30 08:59:02,365 - INFO - Split ID: 0 +2024-05-30 08:59:02,366 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 2.0 +2024-05-30 08:59:02,366 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 4.91 +2024-05-30 08:59:02,366 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 7.73 +2024-05-30 08:59:02,366 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 12.64 +2024-05-30 08:59:02,367 - INFO - +No prior +2024-05-30 08:59:02,395 - INFO - (24633,) +2024-05-30 08:59:03,056 - INFO - Split ID: 0 +2024-05-30 08:59:03,056 - INFO - Top 1 (Epoch 50)acc (%): 76.08 +2024-05-30 08:59:03,056 - INFO - Top 3 (Epoch 50)acc (%): 90.98 +2024-05-30 08:59:03,056 - INFO - Top 5 (Epoch 50)acc (%): 94.06 +2024-05-30 08:59:03,056 - INFO - Top 10 (Epoch 50)acc (%): 96.83 +2024-05-30 08:59:18,035 - INFO - Split ID: 0 +2024-05-30 08:59:18,035 - INFO - Top 1 (Epoch 50)acc (%): 81.04 +2024-05-30 08:59:18,037 - INFO - Top 3 (Epoch 50)acc (%): 93.32 +2024-05-30 08:59:18,037 - INFO - Top 5 (Epoch 50)acc (%): 95.78 +2024-05-30 08:59:18,037 - INFO - Top 10 (Epoch 50)acc (%): 97.84 +2024-05-30 08:59:18,038 - INFO - +Epoch 51 +2024-05-30 08:59:19,302 - INFO - [20480/22599] Loss : 0.9228 +2024-05-30 08:59:19,354 - INFO - Test loss : 0.4821 +2024-05-30 08:59:19,354 - INFO - +Epoch 52 +2024-05-30 08:59:20,627 - INFO - [20480/22599] Loss : 0.9243 +2024-05-30 08:59:20,686 - INFO - Test loss : 0.4746 +2024-05-30 08:59:20,687 - INFO - +Epoch 53 +2024-05-30 08:59:21,958 - INFO - [20480/22599] Loss : 0.9206 +2024-05-30 08:59:22,010 - INFO - Test loss : 0.4737 +2024-05-30 08:59:22,010 - INFO - +Epoch 54 +2024-05-30 08:59:23,274 - INFO - [20480/22599] Loss : 0.9203 +2024-05-30 08:59:23,326 - INFO - Test loss : 0.4721 +2024-05-30 08:59:23,326 - INFO - +Epoch 55 +2024-05-30 08:59:24,591 - INFO - [20480/22599] Loss : 0.9210 +2024-05-30 08:59:24,643 - INFO - Test loss : 0.4675 +2024-05-30 08:59:24,643 - INFO - +Epoch 56 +2024-05-30 08:59:25,900 - INFO - [20480/22599] Loss : 0.9187 +2024-05-30 08:59:25,951 - INFO - Test loss : 0.4638 +2024-05-30 08:59:25,952 - INFO - +Epoch 57 +2024-05-30 08:59:27,222 - INFO - [20480/22599] Loss : 0.9187 +2024-05-30 08:59:27,275 - INFO - Test loss : 0.4713 +2024-05-30 08:59:27,275 - INFO - +Epoch 58 +2024-05-30 08:59:28,560 - INFO - [20480/22599] Loss : 0.9156 +2024-05-30 08:59:28,612 - INFO - Test loss : 0.4772 +2024-05-30 08:59:28,612 - INFO - +Epoch 59 +2024-05-30 08:59:29,882 - INFO - [20480/22599] Loss : 0.9142 +2024-05-30 08:59:29,951 - INFO - Test loss : 0.4821 +2024-05-30 08:59:29,951 - INFO - +Epoch 60 +2024-05-30 08:59:31,289 - INFO - [20480/22599] Loss : 0.9114 +2024-05-30 08:59:31,341 - INFO - Test loss : 0.4817 +2024-05-30 08:59:31,346 - INFO - (1100,) +2024-05-30 08:59:31,392 - INFO - Split ID: 0 +2024-05-30 08:59:31,393 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 1.45 +2024-05-30 08:59:31,393 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 5.0 +2024-05-30 08:59:31,393 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 6.82 +2024-05-30 08:59:31,393 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 13.91 +2024-05-30 08:59:31,393 - INFO - +No prior +2024-05-30 08:59:31,421 - INFO - (24633,) +2024-05-30 08:59:32,083 - INFO - Split ID: 0 +2024-05-30 08:59:32,083 - INFO - Top 1 (Epoch 60)acc (%): 76.08 +2024-05-30 08:59:32,083 - INFO - Top 3 (Epoch 60)acc (%): 90.98 
+2024-05-30 08:59:32,083 - INFO - Top 5 (Epoch 60)acc (%): 94.06 +2024-05-30 08:59:32,083 - INFO - Top 10 (Epoch 60)acc (%): 96.83 +2024-05-30 08:59:47,116 - INFO - Split ID: 0 +2024-05-30 08:59:47,116 - INFO - Top 1 (Epoch 60)acc (%): 81.25 +2024-05-30 08:59:47,116 - INFO - Top 3 (Epoch 60)acc (%): 93.35 +2024-05-30 08:59:47,117 - INFO - Top 5 (Epoch 60)acc (%): 95.73 +2024-05-30 08:59:47,117 - INFO - Top 10 (Epoch 60)acc (%): 97.85 +2024-05-30 08:59:47,117 - INFO - +Epoch 61 +2024-05-30 08:59:48,390 - INFO - [20480/22599] Loss : 0.9112 +2024-05-30 08:59:48,441 - INFO - Test loss : 0.4698 +2024-05-30 08:59:48,442 - INFO - +Epoch 62 +2024-05-30 08:59:49,730 - INFO - [20480/22599] Loss : 0.9121 +2024-05-30 08:59:49,783 - INFO - Test loss : 0.4708 +2024-05-30 08:59:49,783 - INFO - +Epoch 63 +2024-05-30 08:59:51,059 - INFO - [20480/22599] Loss : 0.9106 +2024-05-30 08:59:51,111 - INFO - Test loss : 0.4759 +2024-05-30 08:59:51,111 - INFO - +Epoch 64 +2024-05-30 08:59:52,381 - INFO - [20480/22599] Loss : 0.9087 +2024-05-30 08:59:52,433 - INFO - Test loss : 0.4835 +2024-05-30 08:59:52,433 - INFO - +Epoch 65 +2024-05-30 08:59:53,707 - INFO - [20480/22599] Loss : 0.9066 +2024-05-30 08:59:53,758 - INFO - Test loss : 0.4828 +2024-05-30 08:59:53,758 - INFO - +Epoch 66 +2024-05-30 08:59:55,020 - INFO - [20480/22599] Loss : 0.9048 +2024-05-30 08:59:55,086 - INFO - Test loss : 0.4914 +2024-05-30 08:59:55,086 - INFO - +Epoch 67 +2024-05-30 08:59:56,349 - INFO - [20480/22599] Loss : 0.9055 +2024-05-30 08:59:56,401 - INFO - Test loss : 0.4730 +2024-05-30 08:59:56,401 - INFO - +Epoch 68 +2024-05-30 08:59:57,695 - INFO - [20480/22599] Loss : 0.9033 +2024-05-30 08:59:57,748 - INFO - Test loss : 0.4872 +2024-05-30 08:59:57,748 - INFO - +Epoch 69 +2024-05-30 08:59:59,017 - INFO - [20480/22599] Loss : 0.9028 +2024-05-30 08:59:59,069 - INFO - Test loss : 0.4823 +2024-05-30 08:59:59,069 - INFO - +Epoch 70 +2024-05-30 09:00:00,332 - INFO - [20480/22599] Loss : 0.9047 +2024-05-30 09:00:00,384 - INFO - Test loss : 0.4872 +2024-05-30 09:00:00,389 - INFO - (1100,) +2024-05-30 09:00:00,434 - INFO - Split ID: 0 +2024-05-30 09:00:00,435 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 2.09 +2024-05-30 09:00:00,435 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 4.73 +2024-05-30 09:00:00,435 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 7.36 +2024-05-30 09:00:00,435 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 13.0 +2024-05-30 09:00:00,436 - INFO - +No prior +2024-05-30 09:00:00,463 - INFO - (24633,) +2024-05-30 09:00:01,118 - INFO - Split ID: 0 +2024-05-30 09:00:01,118 - INFO - Top 1 (Epoch 70)acc (%): 76.08 +2024-05-30 09:00:01,118 - INFO - Top 3 (Epoch 70)acc (%): 90.98 +2024-05-30 09:00:01,118 - INFO - Top 5 (Epoch 70)acc (%): 94.06 +2024-05-30 09:00:01,118 - INFO - Top 10 (Epoch 70)acc (%): 96.83 +2024-05-30 09:00:16,155 - INFO - Split ID: 0 +2024-05-30 09:00:16,155 - INFO - Top 1 (Epoch 70)acc (%): 81.22 +2024-05-30 09:00:16,155 - INFO - Top 3 (Epoch 70)acc (%): 93.38 +2024-05-30 09:00:16,155 - INFO - Top 5 (Epoch 70)acc (%): 95.72 +2024-05-30 09:00:16,155 - INFO - Top 10 (Epoch 70)acc (%): 97.87 +2024-05-30 09:00:16,156 - INFO - +Epoch 71 +2024-05-30 09:00:17,412 - INFO - [20480/22599] Loss : 0.9045 +2024-05-30 09:00:17,471 - INFO - Test loss : 0.4760 +2024-05-30 09:00:17,471 - INFO - +Epoch 72 +2024-05-30 09:00:18,740 - INFO - [20480/22599] Loss : 0.9012 +2024-05-30 09:00:18,792 - INFO - Test loss : 0.4860 +2024-05-30 09:00:18,792 - INFO - +Epoch 73 +2024-05-30 09:00:20,058 - INFO - [20480/22599] Loss : 0.9033 +2024-05-30 09:00:20,110 - 
INFO - Test loss : 0.4819 +2024-05-30 09:00:20,110 - INFO - +Epoch 74 +2024-05-30 09:00:21,459 - INFO - [20480/22599] Loss : 0.8990 +2024-05-30 09:00:21,512 - INFO - Test loss : 0.4814 +2024-05-30 09:00:21,512 - INFO - +Epoch 75 +2024-05-30 09:00:22,787 - INFO - [20480/22599] Loss : 0.9008 +2024-05-30 09:00:22,839 - INFO - Test loss : 0.4770 +2024-05-30 09:00:22,839 - INFO - +Epoch 76 +2024-05-30 09:00:24,110 - INFO - [20480/22599] Loss : 0.8983 +2024-05-30 09:00:24,161 - INFO - Test loss : 0.4956 +2024-05-30 09:00:24,161 - INFO - +Epoch 77 +2024-05-30 09:00:25,435 - INFO - [20480/22599] Loss : 0.8990 +2024-05-30 09:00:25,487 - INFO - Test loss : 0.4861 +2024-05-30 09:00:25,487 - INFO - +Epoch 78 +2024-05-30 09:00:26,758 - INFO - [20480/22599] Loss : 0.8963 +2024-05-30 09:00:26,810 - INFO - Test loss : 0.4913 +2024-05-30 09:00:26,810 - INFO - +Epoch 79 +2024-05-30 09:00:28,081 - INFO - [20480/22599] Loss : 0.8942 +2024-05-30 09:00:28,138 - INFO - Test loss : 0.4948 +2024-05-30 09:00:28,138 - INFO - +Epoch 80 +2024-05-30 09:00:29,409 - INFO - [20480/22599] Loss : 0.8948 +2024-05-30 09:00:29,461 - INFO - Test loss : 0.4926 +2024-05-30 09:00:29,467 - INFO - (1100,) +2024-05-30 09:00:29,512 - INFO - Split ID: 0 +2024-05-30 09:00:29,513 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 2.0 +2024-05-30 09:00:29,513 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 5.27 +2024-05-30 09:00:29,513 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 8.09 +2024-05-30 09:00:29,513 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 13.73 +2024-05-30 09:00:29,514 - INFO - +No prior +2024-05-30 09:00:29,541 - INFO - (24633,) +2024-05-30 09:00:30,197 - INFO - Split ID: 0 +2024-05-30 09:00:30,197 - INFO - Top 1 (Epoch 80)acc (%): 76.08 +2024-05-30 09:00:30,197 - INFO - Top 3 (Epoch 80)acc (%): 90.98 +2024-05-30 09:00:30,197 - INFO - Top 5 (Epoch 80)acc (%): 94.06 +2024-05-30 09:00:30,197 - INFO - Top 10 (Epoch 80)acc (%): 96.83 +2024-05-30 09:00:45,201 - INFO - Split ID: 0 +2024-05-30 09:00:45,201 - INFO - Top 1 (Epoch 80)acc (%): 81.42 +2024-05-30 09:00:45,201 - INFO - Top 3 (Epoch 80)acc (%): 93.43 +2024-05-30 09:00:45,201 - INFO - Top 5 (Epoch 80)acc (%): 95.73 +2024-05-30 09:00:45,201 - INFO - Top 10 (Epoch 80)acc (%): 97.84 +2024-05-30 09:00:45,202 - INFO - +Epoch 81 +2024-05-30 09:00:46,469 - INFO - [20480/22599] Loss : 0.8971 +2024-05-30 09:00:46,521 - INFO - Test loss : 0.4850 +2024-05-30 09:00:46,521 - INFO - +Epoch 82 +2024-05-30 09:00:47,806 - INFO - [20480/22599] Loss : 0.8930 +2024-05-30 09:00:47,857 - INFO - Test loss : 0.4835 +2024-05-30 09:00:47,857 - INFO - +Epoch 83 +2024-05-30 09:00:49,126 - INFO - [20480/22599] Loss : 0.8921 +2024-05-30 09:00:49,178 - INFO - Test loss : 0.4924 +2024-05-30 09:00:49,178 - INFO - +Epoch 84 +2024-05-30 09:00:50,444 - INFO - [20480/22599] Loss : 0.8914 +2024-05-30 09:00:50,495 - INFO - Test loss : 0.4950 +2024-05-30 09:00:50,496 - INFO - +Epoch 85 +2024-05-30 09:00:51,770 - INFO - [20480/22599] Loss : 0.8902 +2024-05-30 09:00:51,821 - INFO - Test loss : 0.4869 +2024-05-30 09:00:51,821 - INFO - +Epoch 86 +2024-05-30 09:00:53,088 - INFO - [20480/22599] Loss : 0.8919 +2024-05-30 09:00:53,140 - INFO - Test loss : 0.4933 +2024-05-30 09:00:53,140 - INFO - +Epoch 87 +2024-05-30 09:00:54,398 - INFO - [20480/22599] Loss : 0.8908 +2024-05-30 09:00:54,450 - INFO - Test loss : 0.5076 +2024-05-30 09:00:54,450 - INFO - +Epoch 88 +2024-05-30 09:00:55,778 - INFO - [20480/22599] Loss : 0.8923 +2024-05-30 09:00:55,834 - INFO - Test loss : 0.4956 +2024-05-30 09:00:55,834 - INFO - +Epoch 89 +2024-05-30 09:00:57,093 
- INFO - [20480/22599] Loss : 0.8881 +2024-05-30 09:00:57,155 - INFO - Test loss : 0.4951 +2024-05-30 09:00:57,156 - INFO - +Epoch 90 +2024-05-30 09:00:58,416 - INFO - [20480/22599] Loss : 0.8903 +2024-05-30 09:00:58,468 - INFO - Test loss : 0.5063 +2024-05-30 09:00:58,473 - INFO - (1100,) +2024-05-30 09:00:58,519 - INFO - Split ID: 0 +2024-05-30 09:00:58,520 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 1.91 +2024-05-30 09:00:58,520 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 4.45 +2024-05-30 09:00:58,520 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 7.73 +2024-05-30 09:00:58,520 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 13.18 +2024-05-30 09:00:58,520 - INFO - +No prior +2024-05-30 09:00:58,548 - INFO - (24633,) +2024-05-30 09:00:59,209 - INFO - Split ID: 0 +2024-05-30 09:00:59,209 - INFO - Top 1 (Epoch 90)acc (%): 76.08 +2024-05-30 09:00:59,209 - INFO - Top 3 (Epoch 90)acc (%): 90.98 +2024-05-30 09:00:59,210 - INFO - Top 5 (Epoch 90)acc (%): 94.06 +2024-05-30 09:00:59,210 - INFO - Top 10 (Epoch 90)acc (%): 96.83 +2024-05-30 09:01:14,286 - INFO - Split ID: 0 +2024-05-30 09:01:14,286 - INFO - Top 1 (Epoch 90)acc (%): 81.35 +2024-05-30 09:01:14,286 - INFO - Top 3 (Epoch 90)acc (%): 93.41 +2024-05-30 09:01:14,286 - INFO - Top 5 (Epoch 90)acc (%): 95.73 +2024-05-30 09:01:14,286 - INFO - Top 10 (Epoch 90)acc (%): 97.82 +2024-05-30 09:01:14,287 - INFO - +Epoch 91 +2024-05-30 09:01:15,570 - INFO - [20480/22599] Loss : 0.8883 +2024-05-30 09:01:15,622 - INFO - Test loss : 0.5004 +2024-05-30 09:01:15,622 - INFO - +Epoch 92 +2024-05-30 09:01:16,886 - INFO - [20480/22599] Loss : 0.8884 +2024-05-30 09:01:16,938 - INFO - Test loss : 0.4955 +2024-05-30 09:01:16,938 - INFO - +Epoch 93 +2024-05-30 09:01:18,205 - INFO - [20480/22599] Loss : 0.8858 +2024-05-30 09:01:18,257 - INFO - Test loss : 0.4984 +2024-05-30 09:01:18,257 - INFO - +Epoch 94 +2024-05-30 09:01:19,514 - INFO - [20480/22599] Loss : 0.8845 +2024-05-30 09:01:19,566 - INFO - Test loss : 0.5018 +2024-05-30 09:01:19,566 - INFO - +Epoch 95 +2024-05-30 09:01:20,820 - INFO - [20480/22599] Loss : 0.8841 +2024-05-30 09:01:20,871 - INFO - Test loss : 0.5095 +2024-05-30 09:01:20,872 - INFO - +Epoch 96 +2024-05-30 09:01:22,148 - INFO - [20480/22599] Loss : 0.8859 +2024-05-30 09:01:22,201 - INFO - Test loss : 0.5037 +2024-05-30 09:01:22,201 - INFO - +Epoch 97 +2024-05-30 09:01:23,466 - INFO - [20480/22599] Loss : 0.8838 +2024-05-30 09:01:23,517 - INFO - Test loss : 0.5053 +2024-05-30 09:01:23,518 - INFO - +Epoch 98 +2024-05-30 09:01:24,778 - INFO - [20480/22599] Loss : 0.8843 +2024-05-30 09:01:24,830 - INFO - Test loss : 0.5032 +2024-05-30 09:01:24,830 - INFO - +Epoch 99 +2024-05-30 09:01:26,092 - INFO - [20480/22599] Loss : 0.8842 +2024-05-30 09:01:26,145 - INFO - Test loss : 0.5051 +2024-05-30 09:01:26,145 - INFO - Saving output model to ../models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar +2024-05-30 09:01:26,163 - INFO - Saving output model to ../models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar +2024-05-30 09:01:26,212 - INFO - +No prior +2024-05-30 09:01:26,243 - INFO - (24633,) +2024-05-30 09:01:26,897 - INFO - Split ID: 0 +2024-05-30 09:01:26,898 - INFO - Top 1 acc (%): 76.08 +2024-05-30 09:01:26,898 - INFO - Top 3 acc (%): 90.98 +2024-05-30 09:01:26,898 - INFO - Top 5 acc (%): 94.06 +2024-05-30 09:01:26,898 - INFO - Top 10 acc (%): 96.83 +2024-05-30 09:01:41,958 - INFO - Split ID: 0 +2024-05-30 
09:01:41,958 - INFO - Top 1 acc (%): 81.44 +2024-05-30 09:01:41,958 - INFO - Top 3 acc (%): 93.46 +2024-05-30 09:01:41,958 - INFO - Top 5 acc (%): 95.74 +2024-05-30 09:01:41,958 - INFO - Top 10 acc (%): 97.84 +2024-05-30 09:01:41,959 - INFO - +Sphere2Vec-dfs +2024-05-30 09:01:41,959 - INFO - Model : model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar +2024-05-30 09:01:41,974 - INFO - (1100,) +2024-05-30 09:01:42,019 - INFO - Split ID: 0 +2024-05-30 09:01:42,020 - INFO - Top 1 LocEnc acc (%): 2.0 +2024-05-30 09:01:42,020 - INFO - Top 3 LocEnc acc (%): 5.36 +2024-05-30 09:01:42,020 - INFO - Top 5 LocEnc acc (%): 8.45 +2024-05-30 09:01:42,020 - INFO - Top 10 LocEnc acc (%): 14.45 +2024-05-31 03:58:35,359 - INFO - +num_classes 555 +2024-05-31 03:58:35,359 - INFO - num train 22599 +2024-05-31 03:58:35,359 - INFO - num val 1100 +2024-05-31 03:58:35,359 - INFO - train loss full_loss +2024-05-31 03:58:35,360 - INFO - model name ../models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:58:35,360 - INFO - num users 5331 +2024-05-31 03:58:35,360 - INFO - meta data ebird_meta +2024-05-31 03:58:36,121 - INFO - +Only Sphere2Vec-dfs +2024-05-31 03:58:36,121 - INFO - Model : model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:58:36,208 - INFO - Saving output model to ../models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:58:36,261 - INFO - Saving output model to ../models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:58:36,302 - INFO - +No prior +2024-05-31 03:58:36,338 - INFO - (24633,) +2024-05-31 03:58:37,816 - INFO - Save results to ../eval_results/eval_nabirds_ebird_meta_test_no_prior.csv +2024-05-31 03:58:37,816 - INFO - Split ID: 0 +2024-05-31 03:58:37,816 - INFO - Top 1 acc (%): 76.08 +2024-05-31 03:58:37,816 - INFO - Top 3 acc (%): 90.98 +2024-05-31 03:58:37,816 - INFO - Top 5 acc (%): 94.06 +2024-05-31 03:58:37,816 - INFO - Top 10 acc (%): 96.83 +2024-05-31 03:58:54,591 - INFO - Split ID: 0 +2024-05-31 03:58:54,591 - INFO - Top 1 hit (%): 81.44 +2024-05-31 03:58:54,591 - INFO - Top 3 hit (%): 93.46 +2024-05-31 03:58:54,591 - INFO - Top 5 hit (%): 95.74 +2024-05-31 03:58:54,591 - INFO - Top 10 hit (%): 97.84 +2024-05-31 03:58:54,597 - INFO - +Only Sphere2Vec-dfs +2024-05-31 03:58:54,597 - INFO - Model : model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar +2024-05-31 03:58:54,639 - INFO - (1100,) +2024-05-31 03:58:54,690 - INFO - Split ID: 0 +2024-05-31 03:58:54,691 - INFO - Top 1 LocEnc acc (%): 2.0 +2024-05-31 03:58:54,691 - INFO - Top 3 LocEnc acc (%): 5.36 +2024-05-31 03:58:54,691 - INFO - Top 5 LocEnc acc (%): 8.45 +2024-05-31 03:58:54,691 - INFO - Top 10 LocEnc acc (%): 14.45 diff --git a/pre_trained_models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar new file mode 100755 index 00000000..d779a2bf Binary files /dev/null and 
b/pre_trained_models/sphere2vec_dfs/model_nabirds_ebird_meta_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_256_BATCH4096_leakyrelu.pth.tar differ diff --git a/pre_trained_models/sphere2vec_dfs/model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.log b/pre_trained_models/sphere2vec_dfs/model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.log new file mode 100755 index 00000000..c32863e4 --- /dev/null +++ b/pre_trained_models/sphere2vec_dfs/model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.log @@ -0,0 +1,602 @@ +2024-05-30 18:20:21,682 - INFO - +num_classes 100 +2024-05-30 18:20:21,682 - INFO - num train 66739 +2024-05-30 18:20:21,682 - INFO - num val 4449 +2024-05-30 18:20:21,682 - INFO - train loss full_loss +2024-05-30 18:20:21,683 - INFO - model name ../models/sphere2vec_dfs/model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 18:20:21,683 - INFO - num users 1 +2024-05-30 18:20:22,442 - INFO - +Epoch 0 +2024-05-30 18:20:34,246 - INFO - [65536/66739] Loss : 1.8735 +2024-05-30 18:20:36,326 - INFO - Test loss : 0.5131 +2024-05-30 18:20:36,326 - INFO - +Epoch 1 +2024-05-30 18:20:48,958 - INFO - [65536/66739] Loss : 1.3218 +2024-05-30 18:20:50,631 - INFO - Test loss : 0.5169 +2024-05-30 18:20:50,631 - INFO - +Epoch 2 +2024-05-30 18:21:01,777 - INFO - [65536/66739] Loss : 1.2888 +2024-05-30 18:21:02,701 - INFO - Test loss : 0.5769 +2024-05-30 18:21:02,701 - INFO - +Epoch 3 +2024-05-30 18:21:15,068 - INFO - [65536/66739] Loss : 1.2712 +2024-05-30 18:21:15,296 - INFO - Test loss : 0.5743 +2024-05-30 18:21:15,296 - INFO - +Epoch 4 +2024-05-30 18:21:27,982 - INFO - [65536/66739] Loss : 1.2627 +2024-05-30 18:21:28,301 - INFO - Test loss : 0.5784 +2024-05-30 18:21:28,301 - INFO - +Epoch 5 +2024-05-30 18:21:39,829 - INFO - [65536/66739] Loss : 1.2572 +2024-05-30 18:21:41,343 - INFO - Test loss : 0.5875 +2024-05-30 18:21:41,343 - INFO - +Epoch 6 +2024-05-30 18:21:55,701 - INFO - [65536/66739] Loss : 1.2536 +2024-05-30 18:21:55,938 - INFO - Test loss : 0.5821 +2024-05-30 18:21:55,938 - INFO - +Epoch 7 +2024-05-30 18:22:08,848 - INFO - [65536/66739] Loss : 1.2514 +2024-05-30 18:22:09,079 - INFO - Test loss : 0.5734 +2024-05-30 18:22:09,079 - INFO - +Epoch 8 +2024-05-30 18:22:20,482 - INFO - [65536/66739] Loss : 1.2491 +2024-05-30 18:22:21,192 - INFO - Test loss : 0.5648 +2024-05-30 18:22:21,192 - INFO - +Epoch 9 +2024-05-30 18:22:33,199 - INFO - [65536/66739] Loss : 1.2473 +2024-05-30 18:22:34,089 - INFO - Test loss : 0.5730 +2024-05-30 18:22:34,089 - INFO - +Epoch 10 +2024-05-30 18:22:45,258 - INFO - [65536/66739] Loss : 1.2458 +2024-05-30 18:22:46,785 - INFO - Test loss : 0.5818 +2024-05-30 18:22:46,805 - INFO - (4449,) +2024-05-30 18:22:46,820 - INFO - Split ID: 0 +2024-05-30 18:22:46,822 - INFO - Top 1 LocEnc (Epoch 10)acc (%): 3.19 +2024-05-30 18:22:46,822 - INFO - Top 3 LocEnc (Epoch 10)acc (%): 9.26 +2024-05-30 18:22:46,822 - INFO - Top 5 LocEnc (Epoch 10)acc (%): 14.59 +2024-05-30 18:22:46,822 - INFO - Top 10 LocEnc (Epoch 10)acc (%): 25.24 +2024-05-30 18:22:46,823 - INFO - +No prior +2024-05-30 18:22:46,825 - INFO - (17798,) +2024-05-30 18:22:46,916 - INFO - Split ID: 0 +2024-05-30 18:22:46,916 - INFO - Top 1 (Epoch 10)acc (%): 50.15 +2024-05-30 18:22:46,917 - INFO - Top 3 (Epoch 10)acc (%): 73.9 +2024-05-30 18:22:46,917 - INFO - Top 5 (Epoch 10)acc (%): 82.45 +2024-05-30 18:22:46,917 - INFO - Top 10 (Epoch 10)acc 
(%): 91.06 +2024-05-30 18:23:01,002 - INFO - Split ID: 0 +2024-05-30 18:23:01,002 - INFO - Top 1 (Epoch 10)acc (%): 50.15 +2024-05-30 18:23:01,003 - INFO - Top 3 (Epoch 10)acc (%): 74.02 +2024-05-30 18:23:01,003 - INFO - Top 5 (Epoch 10)acc (%): 82.63 +2024-05-30 18:23:01,003 - INFO - Top 10 (Epoch 10)acc (%): 91.0 +2024-05-30 18:23:01,003 - INFO - +Epoch 11 +2024-05-30 18:23:13,608 - INFO - [65536/66739] Loss : 1.2447 +2024-05-30 18:23:13,850 - INFO - Test loss : 0.5753 +2024-05-30 18:23:13,850 - INFO - +Epoch 12 +2024-05-30 18:23:26,855 - INFO - [65536/66739] Loss : 1.2433 +2024-05-30 18:23:27,097 - INFO - Test loss : 0.5634 +2024-05-30 18:23:27,097 - INFO - +Epoch 13 +2024-05-30 18:23:39,377 - INFO - [65536/66739] Loss : 1.2409 +2024-05-30 18:23:39,603 - INFO - Test loss : 0.5591 +2024-05-30 18:23:39,603 - INFO - +Epoch 14 +2024-05-30 18:23:51,391 - INFO - [65536/66739] Loss : 1.2401 +2024-05-30 18:23:52,953 - INFO - Test loss : 0.5718 +2024-05-30 18:23:52,953 - INFO - +Epoch 15 +2024-05-30 18:24:03,738 - INFO - [65536/66739] Loss : 1.2389 +2024-05-30 18:24:05,767 - INFO - Test loss : 0.5682 +2024-05-30 18:24:05,767 - INFO - +Epoch 16 +2024-05-30 18:24:18,494 - INFO - [65536/66739] Loss : 1.2375 +2024-05-30 18:24:19,800 - INFO - Test loss : 0.5795 +2024-05-30 18:24:19,800 - INFO - +Epoch 17 +2024-05-30 18:24:32,479 - INFO - [65536/66739] Loss : 1.2364 +2024-05-30 18:24:32,721 - INFO - Test loss : 0.5519 +2024-05-30 18:24:32,721 - INFO - +Epoch 18 +2024-05-30 18:24:45,872 - INFO - [65536/66739] Loss : 1.2324 +2024-05-30 18:24:46,282 - INFO - Test loss : 0.5547 +2024-05-30 18:24:46,282 - INFO - +Epoch 19 +2024-05-30 18:24:58,513 - INFO - [65536/66739] Loss : 1.2294 +2024-05-30 18:25:00,049 - INFO - Test loss : 0.5446 +2024-05-30 18:25:00,050 - INFO - +Epoch 20 +2024-05-30 18:25:11,010 - INFO - [65536/66739] Loss : 1.2278 +2024-05-30 18:25:12,384 - INFO - Test loss : 0.5537 +2024-05-30 18:25:12,399 - INFO - (4449,) +2024-05-30 18:25:12,415 - INFO - Split ID: 0 +2024-05-30 18:25:12,417 - INFO - Top 1 LocEnc (Epoch 20)acc (%): 3.89 +2024-05-30 18:25:12,417 - INFO - Top 3 LocEnc (Epoch 20)acc (%): 9.69 +2024-05-30 18:25:12,417 - INFO - Top 5 LocEnc (Epoch 20)acc (%): 16.12 +2024-05-30 18:25:12,417 - INFO - Top 10 LocEnc (Epoch 20)acc (%): 27.96 +2024-05-30 18:25:12,418 - INFO - +No prior +2024-05-30 18:25:12,420 - INFO - (17798,) +2024-05-30 18:25:12,514 - INFO - Split ID: 0 +2024-05-30 18:25:12,515 - INFO - Top 1 (Epoch 20)acc (%): 50.15 +2024-05-30 18:25:12,515 - INFO - Top 3 (Epoch 20)acc (%): 73.9 +2024-05-30 18:25:12,515 - INFO - Top 5 (Epoch 20)acc (%): 82.45 +2024-05-30 18:25:12,515 - INFO - Top 10 (Epoch 20)acc (%): 91.06 +2024-05-30 18:25:26,718 - INFO - Split ID: 0 +2024-05-30 18:25:26,719 - INFO - Top 1 (Epoch 20)acc (%): 50.26 +2024-05-30 18:25:26,719 - INFO - Top 3 (Epoch 20)acc (%): 74.22 +2024-05-30 18:25:26,719 - INFO - Top 5 (Epoch 20)acc (%): 82.71 +2024-05-30 18:25:26,719 - INFO - Top 10 (Epoch 20)acc (%): 91.12 +2024-05-30 18:25:26,719 - INFO - +Epoch 21 +2024-05-30 18:25:39,823 - INFO - [65536/66739] Loss : 1.2264 +2024-05-30 18:25:40,413 - INFO - Test loss : 0.5288 +2024-05-30 18:25:40,413 - INFO - +Epoch 22 +2024-05-30 18:25:52,477 - INFO - [65536/66739] Loss : 1.2246 +2024-05-30 18:25:52,724 - INFO - Test loss : 0.5497 +2024-05-30 18:25:52,724 - INFO - +Epoch 23 +2024-05-30 18:26:04,474 - INFO - [65536/66739] Loss : 1.2230 +2024-05-30 18:26:05,355 - INFO - Test loss : 0.5401 +2024-05-30 18:26:05,355 - INFO - +Epoch 24 +2024-05-30 18:26:17,719 - INFO - [65536/66739] 
Loss : 1.2219 +2024-05-30 18:26:19,349 - INFO - Test loss : 0.5482 +2024-05-30 18:26:19,350 - INFO - +Epoch 25 +2024-05-30 18:26:31,898 - INFO - [65536/66739] Loss : 1.2206 +2024-05-30 18:26:32,988 - INFO - Test loss : 0.5427 +2024-05-30 18:26:32,989 - INFO - +Epoch 26 +2024-05-30 18:26:46,665 - INFO - [65536/66739] Loss : 1.2198 +2024-05-30 18:26:48,592 - INFO - Test loss : 0.5336 +2024-05-30 18:26:48,592 - INFO - +Epoch 27 +2024-05-30 18:27:01,374 - INFO - [65536/66739] Loss : 1.2184 +2024-05-30 18:27:01,604 - INFO - Test loss : 0.5425 +2024-05-30 18:27:01,604 - INFO - +Epoch 28 +2024-05-30 18:27:13,504 - INFO - [65536/66739] Loss : 1.2185 +2024-05-30 18:27:13,737 - INFO - Test loss : 0.5329 +2024-05-30 18:27:13,737 - INFO - +Epoch 29 +2024-05-30 18:27:26,448 - INFO - [65536/66739] Loss : 1.2156 +2024-05-30 18:27:26,678 - INFO - Test loss : 0.5433 +2024-05-30 18:27:26,678 - INFO - +Epoch 30 +2024-05-30 18:27:39,209 - INFO - [65536/66739] Loss : 1.2153 +2024-05-30 18:27:39,439 - INFO - Test loss : 0.5239 +2024-05-30 18:27:39,452 - INFO - (4449,) +2024-05-30 18:27:39,470 - INFO - Split ID: 0 +2024-05-30 18:27:39,472 - INFO - Top 1 LocEnc (Epoch 30)acc (%): 3.06 +2024-05-30 18:27:39,472 - INFO - Top 3 LocEnc (Epoch 30)acc (%): 10.5 +2024-05-30 18:27:39,472 - INFO - Top 5 LocEnc (Epoch 30)acc (%): 16.57 +2024-05-30 18:27:39,472 - INFO - Top 10 LocEnc (Epoch 30)acc (%): 27.83 +2024-05-30 18:27:39,473 - INFO - +No prior +2024-05-30 18:27:39,475 - INFO - (17798,) +2024-05-30 18:27:39,577 - INFO - Split ID: 0 +2024-05-30 18:27:39,578 - INFO - Top 1 (Epoch 30)acc (%): 50.15 +2024-05-30 18:27:39,578 - INFO - Top 3 (Epoch 30)acc (%): 73.9 +2024-05-30 18:27:39,578 - INFO - Top 5 (Epoch 30)acc (%): 82.45 +2024-05-30 18:27:39,578 - INFO - Top 10 (Epoch 30)acc (%): 91.06 +2024-05-30 18:27:53,913 - INFO - Split ID: 0 +2024-05-30 18:27:53,913 - INFO - Top 1 (Epoch 30)acc (%): 50.32 +2024-05-30 18:27:53,913 - INFO - Top 3 (Epoch 30)acc (%): 74.35 +2024-05-30 18:27:53,914 - INFO - Top 5 (Epoch 30)acc (%): 82.85 +2024-05-30 18:27:53,914 - INFO - Top 10 (Epoch 30)acc (%): 91.24 +2024-05-30 18:27:53,914 - INFO - +Epoch 31 +2024-05-30 18:28:05,582 - INFO - [65536/66739] Loss : 1.2132 +2024-05-30 18:28:07,148 - INFO - Test loss : 0.5440 +2024-05-30 18:28:07,149 - INFO - +Epoch 32 +2024-05-30 18:28:17,924 - INFO - [65536/66739] Loss : 1.2092 +2024-05-30 18:28:18,503 - INFO - Test loss : 0.5384 +2024-05-30 18:28:18,503 - INFO - +Epoch 33 +2024-05-30 18:28:30,212 - INFO - [65536/66739] Loss : 1.2073 +2024-05-30 18:28:31,100 - INFO - Test loss : 0.5381 +2024-05-30 18:28:31,100 - INFO - +Epoch 34 +2024-05-30 18:28:43,532 - INFO - [65536/66739] Loss : 1.2051 +2024-05-30 18:28:44,899 - INFO - Test loss : 0.5343 +2024-05-30 18:28:44,899 - INFO - +Epoch 35 +2024-05-30 18:28:56,698 - INFO - [65536/66739] Loss : 1.2014 +2024-05-30 18:28:57,513 - INFO - Test loss : 0.5181 +2024-05-30 18:28:57,513 - INFO - +Epoch 36 +2024-05-30 18:29:09,811 - INFO - [65536/66739] Loss : 1.1991 +2024-05-30 18:29:10,088 - INFO - Test loss : 0.5134 +2024-05-30 18:29:10,089 - INFO - +Epoch 37 +2024-05-30 18:29:19,586 - INFO - [65536/66739] Loss : 1.1963 +2024-05-30 18:29:19,820 - INFO - Test loss : 0.5167 +2024-05-30 18:29:19,820 - INFO - +Epoch 38 +2024-05-30 18:29:23,884 - INFO - [65536/66739] Loss : 1.1950 +2024-05-30 18:29:24,194 - INFO - Test loss : 0.5211 +2024-05-30 18:29:24,194 - INFO - +Epoch 39 +2024-05-30 18:29:28,273 - INFO - [65536/66739] Loss : 1.1918 +2024-05-30 18:29:28,504 - INFO - Test loss : 0.5194 +2024-05-30 18:29:28,504 - 
INFO - +Epoch 40 +2024-05-30 18:29:32,578 - INFO - [65536/66739] Loss : 1.1924 +2024-05-30 18:29:32,812 - INFO - Test loss : 0.5178 +2024-05-30 18:29:32,824 - INFO - (4449,) +2024-05-30 18:29:32,843 - INFO - Split ID: 0 +2024-05-30 18:29:32,844 - INFO - Top 1 LocEnc (Epoch 40)acc (%): 4.83 +2024-05-30 18:29:32,844 - INFO - Top 3 LocEnc (Epoch 40)acc (%): 12.41 +2024-05-30 18:29:32,844 - INFO - Top 5 LocEnc (Epoch 40)acc (%): 18.59 +2024-05-30 18:29:32,845 - INFO - Top 10 LocEnc (Epoch 40)acc (%): 31.71 +2024-05-30 18:29:32,845 - INFO - +No prior +2024-05-30 18:29:32,847 - INFO - (17798,) +2024-05-30 18:29:32,942 - INFO - Split ID: 0 +2024-05-30 18:29:32,942 - INFO - Top 1 (Epoch 40)acc (%): 50.15 +2024-05-30 18:29:32,942 - INFO - Top 3 (Epoch 40)acc (%): 73.9 +2024-05-30 18:29:32,942 - INFO - Top 5 (Epoch 40)acc (%): 82.45 +2024-05-30 18:29:32,942 - INFO - Top 10 (Epoch 40)acc (%): 91.06 +2024-05-30 18:29:43,449 - INFO - Split ID: 0 +2024-05-30 18:29:43,449 - INFO - Top 1 (Epoch 40)acc (%): 50.53 +2024-05-30 18:29:43,449 - INFO - Top 3 (Epoch 40)acc (%): 74.42 +2024-05-30 18:29:43,449 - INFO - Top 5 (Epoch 40)acc (%): 83.03 +2024-05-30 18:29:43,449 - INFO - Top 10 (Epoch 40)acc (%): 91.4 +2024-05-30 18:29:43,450 - INFO - +Epoch 41 +2024-05-30 18:29:47,553 - INFO - [65536/66739] Loss : 1.1921 +2024-05-30 18:29:47,786 - INFO - Test loss : 0.5066 +2024-05-30 18:29:47,787 - INFO - +Epoch 42 +2024-05-30 18:29:53,871 - INFO - [65536/66739] Loss : 1.1900 +2024-05-30 18:29:54,358 - INFO - Test loss : 0.5114 +2024-05-30 18:29:54,358 - INFO - +Epoch 43 +2024-05-30 18:30:02,339 - INFO - [65536/66739] Loss : 1.1886 +2024-05-30 18:30:02,827 - INFO - Test loss : 0.5081 +2024-05-30 18:30:02,827 - INFO - +Epoch 44 +2024-05-30 18:30:13,556 - INFO - [65536/66739] Loss : 1.1889 +2024-05-30 18:30:13,786 - INFO - Test loss : 0.5160 +2024-05-30 18:30:13,787 - INFO - +Epoch 45 +2024-05-30 18:30:25,222 - INFO - [65536/66739] Loss : 1.1862 +2024-05-30 18:30:25,455 - INFO - Test loss : 0.5259 +2024-05-30 18:30:25,456 - INFO - +Epoch 46 +2024-05-30 18:30:37,774 - INFO - [65536/66739] Loss : 1.1843 +2024-05-30 18:30:38,645 - INFO - Test loss : 0.5161 +2024-05-30 18:30:38,645 - INFO - +Epoch 47 +2024-05-30 18:30:50,790 - INFO - [65536/66739] Loss : 1.1829 +2024-05-30 18:30:51,348 - INFO - Test loss : 0.5081 +2024-05-30 18:30:51,348 - INFO - +Epoch 48 +2024-05-30 18:31:04,069 - INFO - [65536/66739] Loss : 1.1832 +2024-05-30 18:31:04,534 - INFO - Test loss : 0.5120 +2024-05-30 18:31:04,534 - INFO - +Epoch 49 +2024-05-30 18:31:17,197 - INFO - [65536/66739] Loss : 1.1817 +2024-05-30 18:31:17,718 - INFO - Test loss : 0.5112 +2024-05-30 18:31:17,718 - INFO - +Epoch 50 +2024-05-30 18:31:29,846 - INFO - [65536/66739] Loss : 1.1809 +2024-05-30 18:31:30,267 - INFO - Test loss : 0.5082 +2024-05-30 18:31:30,283 - INFO - (4449,) +2024-05-30 18:31:30,302 - INFO - Split ID: 0 +2024-05-30 18:31:30,304 - INFO - Top 1 LocEnc (Epoch 50)acc (%): 6.61 +2024-05-30 18:31:30,304 - INFO - Top 3 LocEnc (Epoch 50)acc (%): 13.94 +2024-05-30 18:31:30,305 - INFO - Top 5 LocEnc (Epoch 50)acc (%): 19.87 +2024-05-30 18:31:30,305 - INFO - Top 10 LocEnc (Epoch 50)acc (%): 33.2 +2024-05-30 18:31:30,305 - INFO - +No prior +2024-05-30 18:31:30,308 - INFO - (17798,) +2024-05-30 18:31:30,400 - INFO - Split ID: 0 +2024-05-30 18:31:30,400 - INFO - Top 1 (Epoch 50)acc (%): 50.15 +2024-05-30 18:31:30,400 - INFO - Top 3 (Epoch 50)acc (%): 73.9 +2024-05-30 18:31:30,400 - INFO - Top 5 (Epoch 50)acc (%): 82.45 +2024-05-30 18:31:30,400 - INFO - Top 10 (Epoch 50)acc 
(%): 91.06 +2024-05-30 18:31:44,580 - INFO - Split ID: 0 +2024-05-30 18:31:44,580 - INFO - Top 1 (Epoch 50)acc (%): 50.56 +2024-05-30 18:31:44,580 - INFO - Top 3 (Epoch 50)acc (%): 74.33 +2024-05-30 18:31:44,581 - INFO - Top 5 (Epoch 50)acc (%): 83.22 +2024-05-30 18:31:44,581 - INFO - Top 10 (Epoch 50)acc (%): 91.47 +2024-05-30 18:31:44,581 - INFO - +Epoch 51 +2024-05-30 18:31:56,077 - INFO - [65536/66739] Loss : 1.1806 +2024-05-30 18:31:56,905 - INFO - Test loss : 0.5109 +2024-05-30 18:31:56,905 - INFO - +Epoch 52 +2024-05-30 18:32:08,598 - INFO - [65536/66739] Loss : 1.1787 +2024-05-30 18:32:08,980 - INFO - Test loss : 0.5065 +2024-05-30 18:32:08,980 - INFO - +Epoch 53 +2024-05-30 18:32:21,686 - INFO - [65536/66739] Loss : 1.1780 +2024-05-30 18:32:22,107 - INFO - Test loss : 0.5122 +2024-05-30 18:32:22,107 - INFO - +Epoch 54 +2024-05-30 18:32:37,615 - INFO - [65536/66739] Loss : 1.1774 +2024-05-30 18:32:37,855 - INFO - Test loss : 0.5158 +2024-05-30 18:32:37,855 - INFO - +Epoch 55 +2024-05-30 18:32:50,134 - INFO - [65536/66739] Loss : 1.1780 +2024-05-30 18:32:51,376 - INFO - Test loss : 0.5073 +2024-05-30 18:32:51,376 - INFO - +Epoch 56 +2024-05-30 18:33:03,360 - INFO - [65536/66739] Loss : 1.1755 +2024-05-30 18:33:05,015 - INFO - Test loss : 0.5091 +2024-05-30 18:33:05,015 - INFO - +Epoch 57 +2024-05-30 18:33:16,296 - INFO - [65536/66739] Loss : 1.1762 +2024-05-30 18:33:16,716 - INFO - Test loss : 0.5200 +2024-05-30 18:33:16,716 - INFO - +Epoch 58 +2024-05-30 18:33:29,893 - INFO - [65536/66739] Loss : 1.1758 +2024-05-30 18:33:30,125 - INFO - Test loss : 0.4990 +2024-05-30 18:33:30,125 - INFO - +Epoch 59 +2024-05-30 18:33:41,987 - INFO - [65536/66739] Loss : 1.1745 +2024-05-30 18:33:42,324 - INFO - Test loss : 0.5139 +2024-05-30 18:33:42,325 - INFO - +Epoch 60 +2024-05-30 18:33:55,070 - INFO - [65536/66739] Loss : 1.1745 +2024-05-30 18:33:56,371 - INFO - Test loss : 0.5064 +2024-05-30 18:33:56,385 - INFO - (4449,) +2024-05-30 18:33:56,405 - INFO - Split ID: 0 +2024-05-30 18:33:56,406 - INFO - Top 1 LocEnc (Epoch 60)acc (%): 5.96 +2024-05-30 18:33:56,407 - INFO - Top 3 LocEnc (Epoch 60)acc (%): 12.63 +2024-05-30 18:33:56,407 - INFO - Top 5 LocEnc (Epoch 60)acc (%): 18.75 +2024-05-30 18:33:56,407 - INFO - Top 10 LocEnc (Epoch 60)acc (%): 33.13 +2024-05-30 18:33:56,408 - INFO - +No prior +2024-05-30 18:33:56,410 - INFO - (17798,) +2024-05-30 18:33:56,507 - INFO - Split ID: 0 +2024-05-30 18:33:56,508 - INFO - Top 1 (Epoch 60)acc (%): 50.15 +2024-05-30 18:33:56,508 - INFO - Top 3 (Epoch 60)acc (%): 73.9 +2024-05-30 18:33:56,508 - INFO - Top 5 (Epoch 60)acc (%): 82.45 +2024-05-30 18:33:56,508 - INFO - Top 10 (Epoch 60)acc (%): 91.06 +2024-05-30 18:34:10,629 - INFO - Split ID: 0 +2024-05-30 18:34:10,629 - INFO - Top 1 (Epoch 60)acc (%): 50.63 +2024-05-30 18:34:10,629 - INFO - Top 3 (Epoch 60)acc (%): 74.4 +2024-05-30 18:34:10,630 - INFO - Top 5 (Epoch 60)acc (%): 83.18 +2024-05-30 18:34:10,630 - INFO - Top 10 (Epoch 60)acc (%): 91.47 +2024-05-30 18:34:10,630 - INFO - +Epoch 61 +2024-05-30 18:34:21,649 - INFO - [65536/66739] Loss : 1.1734 +2024-05-30 18:34:22,359 - INFO - Test loss : 0.5176 +2024-05-30 18:34:22,360 - INFO - +Epoch 62 +2024-05-30 18:34:35,282 - INFO - [65536/66739] Loss : 1.1723 +2024-05-30 18:34:35,525 - INFO - Test loss : 0.5269 +2024-05-30 18:34:35,526 - INFO - +Epoch 63 +2024-05-30 18:34:48,463 - INFO - [65536/66739] Loss : 1.1724 +2024-05-30 18:34:48,693 - INFO - Test loss : 0.5044 +2024-05-30 18:34:48,693 - INFO - +Epoch 64 +2024-05-30 18:35:02,802 - INFO - [65536/66739] 
Loss : 1.1712 +2024-05-30 18:35:04,348 - INFO - Test loss : 0.5375 +2024-05-30 18:35:04,348 - INFO - +Epoch 65 +2024-05-30 18:35:17,336 - INFO - [65536/66739] Loss : 1.1705 +2024-05-30 18:35:17,944 - INFO - Test loss : 0.5252 +2024-05-30 18:35:17,945 - INFO - +Epoch 66 +2024-05-30 18:35:29,835 - INFO - [65536/66739] Loss : 1.1693 +2024-05-30 18:35:30,504 - INFO - Test loss : 0.5099 +2024-05-30 18:35:30,504 - INFO - +Epoch 67 +2024-05-30 18:35:42,755 - INFO - [65536/66739] Loss : 1.1692 +2024-05-30 18:35:42,990 - INFO - Test loss : 0.5256 +2024-05-30 18:35:42,991 - INFO - +Epoch 68 +2024-05-30 18:35:55,773 - INFO - [65536/66739] Loss : 1.1681 +2024-05-30 18:35:56,002 - INFO - Test loss : 0.5147 +2024-05-30 18:35:56,002 - INFO - +Epoch 69 +2024-05-30 18:36:08,274 - INFO - [65536/66739] Loss : 1.1689 +2024-05-30 18:36:08,504 - INFO - Test loss : 0.5008 +2024-05-30 18:36:08,504 - INFO - +Epoch 70 +2024-05-30 18:36:21,109 - INFO - [65536/66739] Loss : 1.1677 +2024-05-30 18:36:21,980 - INFO - Test loss : 0.5127 +2024-05-30 18:36:21,994 - INFO - (4449,) +2024-05-30 18:36:22,014 - INFO - Split ID: 0 +2024-05-30 18:36:22,015 - INFO - Top 1 LocEnc (Epoch 70)acc (%): 5.87 +2024-05-30 18:36:22,016 - INFO - Top 3 LocEnc (Epoch 70)acc (%): 12.99 +2024-05-30 18:36:22,016 - INFO - Top 5 LocEnc (Epoch 70)acc (%): 18.36 +2024-05-30 18:36:22,016 - INFO - Top 10 LocEnc (Epoch 70)acc (%): 33.18 +2024-05-30 18:36:22,016 - INFO - +No prior +2024-05-30 18:36:22,019 - INFO - (17798,) +2024-05-30 18:36:22,112 - INFO - Split ID: 0 +2024-05-30 18:36:22,112 - INFO - Top 1 (Epoch 70)acc (%): 50.15 +2024-05-30 18:36:22,112 - INFO - Top 3 (Epoch 70)acc (%): 73.9 +2024-05-30 18:36:22,112 - INFO - Top 5 (Epoch 70)acc (%): 82.45 +2024-05-30 18:36:22,112 - INFO - Top 10 (Epoch 70)acc (%): 91.06 +2024-05-30 18:36:36,300 - INFO - Split ID: 0 +2024-05-30 18:36:36,300 - INFO - Top 1 (Epoch 70)acc (%): 50.64 +2024-05-30 18:36:36,300 - INFO - Top 3 (Epoch 70)acc (%): 74.38 +2024-05-30 18:36:36,300 - INFO - Top 5 (Epoch 70)acc (%): 83.33 +2024-05-30 18:36:36,300 - INFO - Top 10 (Epoch 70)acc (%): 91.53 +2024-05-30 18:36:36,300 - INFO - +Epoch 71 +2024-05-30 18:36:47,651 - INFO - [65536/66739] Loss : 1.1679 +2024-05-30 18:36:48,202 - INFO - Test loss : 0.5065 +2024-05-30 18:36:48,203 - INFO - +Epoch 72 +2024-05-30 18:37:00,644 - INFO - [65536/66739] Loss : 1.1679 +2024-05-30 18:37:00,970 - INFO - Test loss : 0.5222 +2024-05-30 18:37:00,970 - INFO - +Epoch 73 +2024-05-30 18:37:13,759 - INFO - [65536/66739] Loss : 1.1660 +2024-05-30 18:37:14,016 - INFO - Test loss : 0.5239 +2024-05-30 18:37:14,016 - INFO - +Epoch 74 +2024-05-30 18:37:26,127 - INFO - [65536/66739] Loss : 1.1662 +2024-05-30 18:37:26,378 - INFO - Test loss : 0.5104 +2024-05-30 18:37:26,378 - INFO - +Epoch 75 +2024-05-30 18:37:41,893 - INFO - [65536/66739] Loss : 1.1646 +2024-05-30 18:37:42,124 - INFO - Test loss : 0.5168 +2024-05-30 18:37:42,125 - INFO - +Epoch 76 +2024-05-30 18:37:54,132 - INFO - [65536/66739] Loss : 1.1639 +2024-05-30 18:37:54,372 - INFO - Test loss : 0.5189 +2024-05-30 18:37:54,372 - INFO - +Epoch 77 +2024-05-30 18:38:06,089 - INFO - [65536/66739] Loss : 1.1642 +2024-05-30 18:38:06,509 - INFO - Test loss : 0.5259 +2024-05-30 18:38:06,509 - INFO - +Epoch 78 +2024-05-30 18:38:19,223 - INFO - [65536/66739] Loss : 1.1632 +2024-05-30 18:38:20,588 - INFO - Test loss : 0.5236 +2024-05-30 18:38:20,588 - INFO - +Epoch 79 +2024-05-30 18:38:32,030 - INFO - [65536/66739] Loss : 1.1631 +2024-05-30 18:38:33,389 - INFO - Test loss : 0.5168 +2024-05-30 18:38:33,389 - 
INFO - +Epoch 80 +2024-05-30 18:38:45,324 - INFO - [65536/66739] Loss : 1.1620 +2024-05-30 18:38:46,187 - INFO - Test loss : 0.5341 +2024-05-30 18:38:46,206 - INFO - (4449,) +2024-05-30 18:38:46,227 - INFO - Split ID: 0 +2024-05-30 18:38:46,229 - INFO - Top 1 LocEnc (Epoch 80)acc (%): 5.44 +2024-05-30 18:38:46,229 - INFO - Top 3 LocEnc (Epoch 80)acc (%): 12.2 +2024-05-30 18:38:46,229 - INFO - Top 5 LocEnc (Epoch 80)acc (%): 18.45 +2024-05-30 18:38:46,229 - INFO - Top 10 LocEnc (Epoch 80)acc (%): 31.0 +2024-05-30 18:38:46,230 - INFO - +No prior +2024-05-30 18:38:46,233 - INFO - (17798,) +2024-05-30 18:38:46,334 - INFO - Split ID: 0 +2024-05-30 18:38:46,334 - INFO - Top 1 (Epoch 80)acc (%): 50.15 +2024-05-30 18:38:46,334 - INFO - Top 3 (Epoch 80)acc (%): 73.9 +2024-05-30 18:38:46,335 - INFO - Top 5 (Epoch 80)acc (%): 82.45 +2024-05-30 18:38:46,335 - INFO - Top 10 (Epoch 80)acc (%): 91.06 +2024-05-30 18:39:00,565 - INFO - Split ID: 0 +2024-05-30 18:39:00,565 - INFO - Top 1 (Epoch 80)acc (%): 50.57 +2024-05-30 18:39:00,565 - INFO - Top 3 (Epoch 80)acc (%): 74.44 +2024-05-30 18:39:00,565 - INFO - Top 5 (Epoch 80)acc (%): 83.17 +2024-05-30 18:39:00,565 - INFO - Top 10 (Epoch 80)acc (%): 91.47 +2024-05-30 18:39:00,566 - INFO - +Epoch 81 +2024-05-30 18:39:12,877 - INFO - [65536/66739] Loss : 1.1622 +2024-05-30 18:39:13,303 - INFO - Test loss : 0.5133 +2024-05-30 18:39:13,303 - INFO - +Epoch 82 +2024-05-30 18:39:25,683 - INFO - [65536/66739] Loss : 1.1625 +2024-05-30 18:39:26,203 - INFO - Test loss : 0.5168 +2024-05-30 18:39:26,203 - INFO - +Epoch 83 +2024-05-30 18:39:39,756 - INFO - [65536/66739] Loss : 1.1612 +2024-05-30 18:39:40,005 - INFO - Test loss : 0.5365 +2024-05-30 18:39:40,005 - INFO - +Epoch 84 +2024-05-30 18:39:52,419 - INFO - [65536/66739] Loss : 1.1614 +2024-05-30 18:39:52,651 - INFO - Test loss : 0.5120 +2024-05-30 18:39:52,651 - INFO - +Epoch 85 +2024-05-30 18:40:06,171 - INFO - [65536/66739] Loss : 1.1610 +2024-05-30 18:40:07,596 - INFO - Test loss : 0.5203 +2024-05-30 18:40:07,596 - INFO - +Epoch 86 +2024-05-30 18:40:19,244 - INFO - [65536/66739] Loss : 1.1603 +2024-05-30 18:40:20,500 - INFO - Test loss : 0.5279 +2024-05-30 18:40:20,501 - INFO - +Epoch 87 +2024-05-30 18:40:32,125 - INFO - [65536/66739] Loss : 1.1600 +2024-05-30 18:40:32,379 - INFO - Test loss : 0.5282 +2024-05-30 18:40:32,380 - INFO - +Epoch 88 +2024-05-30 18:40:45,728 - INFO - [65536/66739] Loss : 1.1591 +2024-05-30 18:40:45,967 - INFO - Test loss : 0.5439 +2024-05-30 18:40:45,968 - INFO - +Epoch 89 +2024-05-30 18:40:57,660 - INFO - [65536/66739] Loss : 1.1598 +2024-05-30 18:40:58,337 - INFO - Test loss : 0.5119 +2024-05-30 18:40:58,337 - INFO - +Epoch 90 +2024-05-30 18:41:10,289 - INFO - [65536/66739] Loss : 1.1587 +2024-05-30 18:41:10,524 - INFO - Test loss : 0.5379 +2024-05-30 18:41:10,540 - INFO - (4449,) +2024-05-30 18:41:10,560 - INFO - Split ID: 0 +2024-05-30 18:41:10,562 - INFO - Top 1 LocEnc (Epoch 90)acc (%): 5.03 +2024-05-30 18:41:10,562 - INFO - Top 3 LocEnc (Epoch 90)acc (%): 12.23 +2024-05-30 18:41:10,562 - INFO - Top 5 LocEnc (Epoch 90)acc (%): 18.05 +2024-05-30 18:41:10,562 - INFO - Top 10 LocEnc (Epoch 90)acc (%): 31.51 +2024-05-30 18:41:10,563 - INFO - +No prior +2024-05-30 18:41:10,565 - INFO - (17798,) +2024-05-30 18:41:10,662 - INFO - Split ID: 0 +2024-05-30 18:41:10,662 - INFO - Top 1 (Epoch 90)acc (%): 50.15 +2024-05-30 18:41:10,662 - INFO - Top 3 (Epoch 90)acc (%): 73.9 +2024-05-30 18:41:10,662 - INFO - Top 5 (Epoch 90)acc (%): 82.45 +2024-05-30 18:41:10,662 - INFO - Top 10 (Epoch 90)acc 
(%): 91.06 +2024-05-30 18:41:24,966 - INFO - Split ID: 0 +2024-05-30 18:41:24,967 - INFO - Top 1 (Epoch 90)acc (%): 50.58 +2024-05-30 18:41:24,968 - INFO - Top 3 (Epoch 90)acc (%): 74.46 +2024-05-30 18:41:24,968 - INFO - Top 5 (Epoch 90)acc (%): 83.15 +2024-05-30 18:41:24,969 - INFO - Top 10 (Epoch 90)acc (%): 91.47 +2024-05-30 18:41:24,969 - INFO - +Epoch 91 +2024-05-30 18:41:35,638 - INFO - [65536/66739] Loss : 1.1583 +2024-05-30 18:41:36,722 - INFO - Test loss : 0.5219 +2024-05-30 18:41:36,722 - INFO - +Epoch 92 +2024-05-30 18:41:48,540 - INFO - [65536/66739] Loss : 1.1587 +2024-05-30 18:41:48,771 - INFO - Test loss : 0.5105 +2024-05-30 18:41:48,772 - INFO - +Epoch 93 +2024-05-30 18:42:00,467 - INFO - [65536/66739] Loss : 1.1582 +2024-05-30 18:42:00,824 - INFO - Test loss : 0.5249 +2024-05-30 18:42:00,824 - INFO - +Epoch 94 +2024-05-30 18:42:13,405 - INFO - [65536/66739] Loss : 1.1578 +2024-05-30 18:42:14,833 - INFO - Test loss : 0.5487 +2024-05-30 18:42:14,833 - INFO - +Epoch 95 +2024-05-30 18:42:27,349 - INFO - [65536/66739] Loss : 1.1587 +2024-05-30 18:42:27,697 - INFO - Test loss : 0.5249 +2024-05-30 18:42:27,697 - INFO - +Epoch 96 +2024-05-30 18:42:43,479 - INFO - [65536/66739] Loss : 1.1582 +2024-05-30 18:42:43,709 - INFO - Test loss : 0.5259 +2024-05-30 18:42:43,709 - INFO - +Epoch 97 +2024-05-30 18:42:55,623 - INFO - [65536/66739] Loss : 1.1571 +2024-05-30 18:42:55,852 - INFO - Test loss : 0.5216 +2024-05-30 18:42:55,852 - INFO - +Epoch 98 +2024-05-30 18:43:08,066 - INFO - [65536/66739] Loss : 1.1567 +2024-05-30 18:43:09,736 - INFO - Test loss : 0.5230 +2024-05-30 18:43:09,736 - INFO - +Epoch 99 +2024-05-30 18:43:21,706 - INFO - [65536/66739] Loss : 1.1573 +2024-05-30 18:43:22,679 - INFO - Test loss : 0.5199 +2024-05-30 18:43:22,679 - INFO - Saving output model to ../models/sphere2vec_dfs/model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 18:43:22,697 - INFO - Saving output model to ../models/sphere2vec_dfs/model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 18:43:22,743 - INFO - +No prior +2024-05-30 18:43:22,745 - INFO - (17798,) +2024-05-30 18:43:22,849 - INFO - Split ID: 0 +2024-05-30 18:43:22,849 - INFO - Top 1 acc (%): 50.15 +2024-05-30 18:43:22,849 - INFO - Top 3 acc (%): 73.9 +2024-05-30 18:43:22,849 - INFO - Top 5 acc (%): 82.45 +2024-05-30 18:43:22,849 - INFO - Top 10 acc (%): 91.06 +2024-05-30 18:43:36,967 - INFO - Split ID: 0 +2024-05-30 18:43:36,967 - INFO - Top 1 acc (%): 50.65 +2024-05-30 18:43:36,968 - INFO - Top 3 acc (%): 74.51 +2024-05-30 18:43:36,968 - INFO - Top 5 acc (%): 83.24 +2024-05-30 18:43:36,968 - INFO - Top 10 acc (%): 91.53 +2024-05-30 18:43:36,968 - INFO - +Sphere2Vec-dfs +2024-05-30 18:43:36,968 - INFO - Model : model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.pth.tar +2024-05-30 18:43:37,002 - INFO - (4449,) +2024-05-30 18:43:37,025 - INFO - Split ID: 0 +2024-05-30 18:43:37,026 - INFO - Top 1 LocEnc acc (%): 5.64 +2024-05-30 18:43:37,026 - INFO - Top 3 LocEnc acc (%): 12.79 +2024-05-30 18:43:37,027 - INFO - Top 5 LocEnc acc (%): 18.95 +2024-05-30 18:43:37,027 - INFO - Top 10 LocEnc acc (%): 31.98 diff --git a/pre_trained_models/sphere2vec_dfs/model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.pth.tar b/pre_trained_models/sphere2vec_dfs/model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.pth.tar new file mode 100755 
index 00000000..ffba59a9 Binary files /dev/null and b/pre_trained_models/sphere2vec_dfs/model_yfcc_Sphere2Vec-dfs_inception_v3_0.0100_8_0.0050000_1.000_3_512_BATCH4096_leakyrelu.pth.tar differ diff --git a/source/3D Location Encoder/Spherical-Harmonics.md b/source/3D Location Encoder/Spherical-Harmonics.md new file mode 100644 index 00000000..a6cd9d24 --- /dev/null +++ b/source/3D Location Encoder/Spherical-Harmonics.md @@ -0,0 +1,116 @@ +# SphericalHarmonicsSpatialRelationLocationEncoder + +## Overview +The `SphericalHarmonicsSpatialRelationLocationEncoder` is designed to encode spatial relationships using spherical harmonics, which are particularly useful for modeling functions on the sphere. This encoder is complemented by the `SphericalHarmonicsSpatialRelationPositionEncoder`, which transforms geographical coordinates into a three-dimensional space and applies spherical harmonics for positional encoding. + +## Features +- **Position Encoding (`self.position_encoder`)**: Utilizes `SphericalHarmonicsSpatialRelationPositionEncoder` for converting longitude and latitude into 3D coordinates and encoding them using spherical harmonics. +- **Feed-Forward Neural Network (`self.ffn`)**: Processes the spherical harmonics encoded data through a multi-layer feed-forward neural network to generate final spatial embeddings. + +## Configuration Parameters +- **spa_embed_dim**: Dimensionality of the output spatial embeddings. +- **coord_dim**: Dimensionality of the coordinate space, typically 2 for geographical coordinates. +- **legendre_poly_num**: Number of Legendre polynomials used in the spherical harmonics computation. +- **device**: Computation device used (e.g., 'cuda' for GPU acceleration). +- **ffn_act**: Activation function for the neural network layers. +- **ffn_num_hidden_layers**: Number of hidden layers in the neural network. +- **ffn_dropout_rate**: Dropout rate to prevent overfitting during training. +- **ffn_hidden_dim**: Dimension of each hidden layer within the network. +- **ffn_use_layernormalize**: Whether to use layer normalization. +- **ffn_skip_connection**: Whether to include skip connections within the network layers. +- **ffn_context_str**: Context string for debugging and detailed logging within the network. + +## Methods +### forward(coords) +- **Purpose**: Processes input coordinates through the encoder to produce spatial embeddings. +- **Parameters**: + - **coords** (List or np.ndarray): Coordinates to process, formatted as (batch_size, num_context_pt, coord_dim). +- **Returns**: + - **sprenc** (Tensor): The final spatial relation embeddings, shaped (batch_size, num_context_pt, spa_embed_dim). + +> ## SphericalHarmonicsSpatialRelationPositionEncoder + +### Overview +This position encoder transforms geographic coordinates (longitude and latitude) into a 3D space using spherical coordinates, and then applies spherical harmonics to produce a high-dimensional representation of these positions. + +### Features +- **3D Coordinate Conversion**: Converts longitude and latitude into 3D spherical coordinates. +- **Spherical Harmonics Encoding**: Applies spherical harmonics to encode the positions in a high-dimensional space, capturing complex spatial relationships. +### Formula +The encoder utilizes spherical harmonics to encode spatial data, transforming coordinates (longitude and latitude) into a three-dimensional spherical coordinate system, and then applying spherical harmonics to these coordinates. + +#### 1. 
Conversion to Spherical Coordinates +Given longitude ( $\phi$ ) and latitude, the coordinates are converted into spherical coordinates using the polar angle $\theta$ (the colatitude). Each point on the surface of the sphere is expressed as: +- $x = \cos(\phi) \sin(\theta)$ +- $y = \sin(\phi) \sin(\theta)$ +- $z = \cos(\theta)$ + +Where: +- $\phi$ is longitude in radians. +- $\theta$ is the polar angle (colatitude) in radians, i.e., $\pi/2$ minus the latitude. + +#### 2. Spherical Harmonics +Spherical harmonics are orthogonal functions defined on the sphere, used to generate a positional encoding. The function $Y_l^m(\theta, \phi)$ for a degree $l$ and order $m$ is given by: + +$$ +Y_l^m(\theta, \phi) = P_l^m(\cos(\theta)) e^{im\phi} +$$ + +Where: +- $P_l^m$ are the associated Legendre polynomials. +- $e^{im\phi}$ is the complex exponential function. + +#### 3. Encoding Formula +The position encoding using spherical harmonics is computed as a sum of these functions across a range of degrees and orders, generally formulated as: + +$$ +\text{Enc}(x, y, z) = \sum_{l=0}^{L} \sum_{m=-l}^{l} c_{lm} Y_l^m(\theta, \phi) +$$ + +Where: +- $c_{lm}$ are coefficients, which may be learned or predefined. +- $L$ is the maximum degree of spherical harmonics used, determined by the `legendre_poly_num`. + +These embeddings are then processed through a feed-forward neural network, incorporating linear transformations and non-linear activations to produce the final spatial relation embeddings suitable for machine learning applications. A minimal numerical sketch of this encoding is given below. + +### Configuration Parameters +- **coord_dim**: Dimensionality of the input space, typically 2 for (longitude, latitude). +- **legendre_poly_num**: Number of Legendre polynomials used for spherical harmonics. +- **device**: Specifies the computation device (e.g., 'cuda'). + +### Methods + +#### make_output_embeds(coords) +- **Description**: Converts geographical coordinates into embeddings using spherical harmonics. +- **Parameters**: + - **coords**: Coordinates in the format (batch_size, num_context_pt, coord_dim). +- **Returns**: + - High-dimensional embeddings representing the input data in terms of spherical harmonics. + +#### forward(coords) +- **Description**: Encodes a list of geographic coordinates into their spherical harmonics embeddings. +- **Parameters**: + - **coords**: A list of coordinates. +- **Returns**: + - Tensor of spatial relation embeddings shaped as (batch_size, num_context_pt, pos_enc_output_dim). + +## Usage Example +```python +import numpy as np + +# Initialize the encoder +encoder = SphericalHarmonicsSpatialRelationLocationEncoder( + spa_embed_dim=64, + coord_dim=2, + legendre_poly_num=8, + device="cuda", + ffn_act="relu", + ffn_num_hidden_layers=1, + ffn_dropout_rate=0.5, + ffn_hidden_dim=256, + ffn_use_layernormalize=True, + ffn_skip_connection=True, + ffn_context_str="SphericalHarmonicsSpatialRelationEncoder" +) + +# Example coordinate data: (batch_size, num_context_pt, coord_dim) with (longitude, latitude) pairs +coords = np.array([[[-118.2437, 34.0522]], [[-74.0060, 40.7128]]]) +embeddings = encoder.forward(coords) +``` diff --git a/source/index.rst b/source/index.rst index 7a9818f4..382c8c7e 100644 --- a/source/index.rst +++ b/source/index.rst @@ -7,7 +7,7 @@ Welcome to TorchSpatial's documentation! ======================================== .. image:: /images/hi.jpg -Here is the **v1.0** +Here is the **v0.1.0** .. toctree:: :maxdepth: 2
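To make the spherical-harmonics position encoding documented in the new `Spherical-Harmonics.md` page concrete, here is a minimal, self-contained numerical sketch. It maps (longitude, latitude) pairs to azimuthal and polar angles, evaluates $Y_l^m$ up to a maximum degree, and stacks the results into one feature vector per location. This is an illustration only, not the TorchSpatial implementation: the `sh_position_encode` helper, the `max_degree` truncation, the use of `scipy.special.sph_harm`, and keeping only the real parts are all assumptions made for this example.

```python
# Illustrative sketch of the spherical-harmonics position encoding described above.
# Not the TorchSpatial code: the helper name, degree truncation, scipy.special.sph_harm,
# and the real-part features are assumptions made for this example.
import numpy as np
from scipy.special import sph_harm  # sph_harm(m, l, azimuthal_angle, polar_angle)

def sh_position_encode(lonlat_deg, max_degree=4):
    """Encode an (N, 2) array of (lon, lat) in degrees into (N, (max_degree + 1)**2) features."""
    lon = np.deg2rad(lonlat_deg[:, 0]) % (2.0 * np.pi)  # azimuthal angle phi, wrapped into [0, 2*pi)
    lat = np.deg2rad(lonlat_deg[:, 1])
    colat = np.pi / 2.0 - lat                           # polar angle theta (colatitude)
    feats = []
    for l in range(max_degree + 1):
        for m in range(-l, l + 1):
            y_lm = sph_harm(m, l, lon, colat)           # complex Y_l^m evaluated at each point
            feats.append(y_lm.real)                     # keep the real part as one feature
    return np.stack(feats, axis=-1)

# Example: Los Angeles and New York as (longitude, latitude)
coords = np.array([[-118.2437, 34.0522], [-74.0060, 40.7128]])
print(sh_position_encode(coords).shape)                 # (2, 25) when max_degree=4
```

In the full encoder, a feature vector like this would then be passed through the configured feed-forward network (hidden dimension, dropout, skip connections, and layer normalization as listed in the configuration parameters) to produce the final `spa_embed_dim`-dimensional embedding.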