
[Bug]: Reshape throws bus error (Core Dumped) on cardiffnlp/twitter-roberta-base-sentiment #27817

Open
dangokuson opened this issue Nov 29, 2024 · 4 comments
Labels
bug Something isn't working support_request

@dangokuson

OpenVINO Version

2023.3.0

Operating System

Other (Please specify in description)

Device used for inference

GPU

Framework

PyTorch

Model used

cardiffnlp/twitter-roberta-base-sentiment

Issue description

Reshape throws a bus error (core dumped).

Step-by-step reproduction

I am using OpenVINO version 2023.3.0, and it crashes while I am trying to save the model:

from pathlib import Path

from openvino.runtime import Model
from openvino.runtime.passes import Manager, Serialize


def save_model_as_ir(model: Model, network_file_name: str, output_dir: Path):
    result_path = output_dir / network_file_name
    output_dir.mkdir(exist_ok=True)
    xml_path = f'{result_path}.xml'.encode('UTF-8')
    bin_path = f'{result_path}.bin'.encode('UTF-8')

    pass_manager = Manager()
    pass_manager.register_pass(Serialize(xml_path, bin_path))
    pass_manager.run_passes(model)

However, it works fine if I switch OpenVINO back to version 2022.3.0.
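As a side note, encoding the paths to bytes with `.encode('UTF-8')` is not required; plain strings are simpler and easier to debug. A minimal sketch of the path construction using only `pathlib` (no OpenVINO calls; `build_ir_paths` is an illustrative helper name, not part of any API):

```python
from pathlib import Path


def build_ir_paths(output_dir: Path, network_file_name: str) -> tuple:
    """Build the .xml/.bin path strings for IR serialization (illustrative helper)."""
    output_dir.mkdir(parents=True, exist_ok=True)
    result_path = output_dir / network_file_name
    # Plain strings instead of UTF-8 encoded bytes.
    return str(result_path.with_suffix('.xml')), str(result_path.with_suffix('.bin'))


xml_path, bin_path = build_ir_paths(Path('/tmp/ir_out'), 'model')
# xml_path ends with 'model.xml', bin_path with 'model.bin'
```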

Relevant log output

No response

Issue submission checklist

  • I'm reporting an issue. It's not a question.
  • I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • There is reproducer code and related data files such as images, videos, models, etc.
@dangokuson dangokuson added bug Something isn't working support_request labels Nov 29, 2024
@ilya-lavrenov ilya-lavrenov changed the title [Bug]: [Bug]: Reshape throw bus error (Core Dumped) on ardiffnlp/twitter-roberta-base-sentiment Nov 29, 2024
@dangokuson
Author

Are there any solutions to this problem? The strange thing is that it works fine if I change OpenVINO to version 2022.3.0.

@praasz
Contributor

praasz commented Dec 4, 2024

xml_path = f'{result_path}.xml'.encode('UTF-8')
bin_path = f'{result_path}.bin'.encode('UTF-8')

Thank you for reporting the issue.
Could you provide a full Python example of how the model is read and written to OpenVINO IR for both versions?
That would make it easier to reproduce the same issue.
I've tried writing the mentioned model to XML and it worked for me, so I'm probably doing something differently.

@dangokuson
Author

Hi @praasz, thanks for your reply.

This is the command line I used to reshape the model:

python3 /Users/ubuntu/workspace/projects/AI_Research/scripts/reshape.py --config /Users/ubuntu/workspace/projects/AI_Research/data/reshape_model_artifacts/165/scripts/reshape_model.config.json

The script showing how the model is read and written is here:

import argparse
import json
from pathlib import Path
from typing import Dict, List
from xml.etree import ElementTree

from openvino.runtime import Core, Model, PartialShape
from openvino.runtime.passes import Manager, Serialize


def parse_arguments():
    parser = argparse.ArgumentParser()
    parser.add_argument('--config', required=True, type=Path)
    return parser.parse_args()


def load_config(config_path: Path) -> dict:
    with config_path.open() as config_file:
        return json.load(config_file)


def construct_shape_configuration(inputs_configuration: List[Dict]) -> Dict[int, PartialShape]:
    return {
        input_configuration['index']: PartialShape(input_configuration['shape'])
        for input_configuration in inputs_configuration
    }


def reshape_model(xml_path: Path, bin_path: Path, shape_configuration: Dict[int, PartialShape]) -> Model:
    core = Core()

    model: Model = core.read_model(
        model=str(xml_path),
        weights=str(bin_path),
    )

    node_output_per_shape = {}
    for input_index, input_configuration in shape_configuration.items():
        node = model.inputs[input_index].node
        node_output_per_shape[node.output(0)] = input_configuration

    model.reshape(node_output_per_shape)

    return model


def save_model_as_ir(model: Model, network_file_name: str, output_dir: Path):
    result_path = output_dir / network_file_name
    output_dir.mkdir(exist_ok=True)
    xml_path = f'{result_path}.xml'.encode('UTF-8')
    bin_path = f'{result_path}.bin'.encode('UTF-8')

    pass_manager = Manager()
    pass_manager.register_pass(Serialize(xml_path, bin_path))
    pass_manager.run_passes(model)


# TODO: Looks like the class to report progress can be reused in another tools, we need to generalize it and make shared
class _ProgressReporter:
    def __init__(self, log_header: str, total_steps: int, progress_step: int = 1):
        self._log_header = log_header
        self._prev_progress = 0
        self._total_steps = total_steps
        self._current_step = 0
        self._progress_step = progress_step

    def _log_progress(self):
        print(f'{self._log_header}: {self._prev_progress}%')

    def next_step(self):
        self._current_step += 1
        progress = int(self._current_step * (100 / self._total_steps))
        if progress - self._prev_progress >= self._progress_step:
            self._prev_progress = progress
            self._log_progress()


def main(config: dict):
    progress_reporter = _ProgressReporter(log_header='[RESHAPE TOOL]',
                                          total_steps=4)
    progress_reporter.next_step()

    xml_path = Path(config['xml_path'])
    bin_path = Path(config['bin_path'])
    dump_reshaped_model = config['dump_reshaped_model']

    inputs_shape_configuration = config['inputs_shape_configuration']

    shape_configuration = construct_shape_configuration(inputs_shape_configuration)

    progress_reporter.next_step()

    reshaped_function = reshape_model(xml_path, bin_path, shape_configuration)
    progress_reporter.next_step()

    if dump_reshaped_model:
        output_dir = Path(config['output_dir'])

        old_model_content = ElementTree.parse(xml_path)
        metadata = old_model_content.find('./meta_data')

        save_model_as_ir(reshaped_function, xml_path.stem, output_dir)

        if metadata is not None:  # an Element with no children is falsy, so test against None
            new_model_content = ElementTree.parse(output_dir / xml_path.name)
            new_model_content.getroot().append(metadata)
            new_model_content.write(output_dir / xml_path.name)

    progress_reporter.next_step()


if __name__ == '__main__':
    ARGUMENTS = parse_arguments()
    CONFIGURATION = load_config(ARGUMENTS.config)
    try:
        main(CONFIGURATION)
    except RuntimeError as e:
        print(f'\nDuring the model reshape process, an OpenVINO runtime error occurred:\n{e}')
        exit(1)
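For what it's worth, the metadata-copy step at the end of `main()` can be exercised on its own with the standard library, independent of OpenVINO. A minimal sketch with toy IR-like XML strings (the helper name and XML content are illustrative):

```python
import xml.etree.ElementTree as ElementTree


def copy_metadata(old_xml: str, new_xml: str) -> str:
    """Copy the <meta_data> element from one IR-like XML document into another."""
    old_root = ElementTree.fromstring(old_xml)
    metadata = old_root.find('./meta_data')
    new_root = ElementTree.fromstring(new_xml)
    # Check `is not None`, not truthiness: an Element with no children is falsy
    # even when it exists, so `if metadata:` silently skips empty elements.
    if metadata is not None:
        new_root.append(metadata)
    return ElementTree.tostring(new_root, encoding='unicode')


old = '<net><meta_data><info value="42"/></meta_data></net>'
new = '<net><layers/></net>'
merged = copy_metadata(old, new)
# merged now contains the <meta_data> element from the old document
```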

And this is the JSON configuration file:

{
    "xml_path": "/Users/ubuntu/workspace/projects/AI_Research/data/models/169/original/cardiffnlp_twitter-roberta-base-sentiment-latest.xml",
    "bin_path": "/Users/ubuntu/workspace/projects/AI_Research/data/models/169/original/cardiffnlp_twitter-roberta-base-sentiment-latest.bin",
    "inputs_shape_configuration": [
        {
            "name": "input_ids",
            "index": 0,
            "shape": [
                1,
                128
            ]
        },
        {
            "name": "attention_mask",
            "index": 1,
            "shape": [
                1,
                128
            ]
        }
    ],
    "dump_reshaped_model": true,
    "output_dir": "/Users/ubuntu/workspace/projects/AI_Research/data/models/169/original"
}
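Nothing in the config itself looks malformed. As a sanity check, the shape-configuration step can be reproduced with the standard library alone; a minimal sketch with `PartialShape` replaced by a plain list so no OpenVINO install is needed:

```python
import json
from typing import Dict, List


def construct_shape_configuration(inputs_configuration: List[dict]) -> Dict[int, List[int]]:
    """Index -> shape mapping, mirroring the reshape script (shapes kept as lists)."""
    return {c['index']: c['shape'] for c in inputs_configuration}


config = json.loads('''{
    "inputs_shape_configuration": [
        {"name": "input_ids", "index": 0, "shape": [1, 128]},
        {"name": "attention_mask", "index": 1, "shape": [1, 128]}
    ]
}''')
shapes = construct_shape_configuration(config['inputs_shape_configuration'])
# shapes == {0: [1, 128], 1: [1, 128]}
```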

As I explained above, the OpenVINO version I used was 2023.3.0, but it works fine when I change to 2022.3.0.

@dangokuson
Author

@praasz How can I solve this problem?
