Rust-bert errors on Debian 12 with precompiled libtorch #462

Open
greensegfault opened this issue Jul 21, 2024 · 0 comments

I get compile-time errors on cargo run; my system is Debian 12.

Cloning rust-bert and simply running cargo run --example sentence_embeddings inside the rust-bert directory works; I get the embeddings printed.
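
For reference, that example is essentially the following (paraphrased from the rust-bert repository, so the exact model type may differ from what the example currently uses):

use rust_bert::pipelines::sentence_embeddings::{
    SentenceEmbeddingsBuilder, SentenceEmbeddingsModelType,
};

fn main() -> anyhow::Result<()> {
    // Download (and cache) a pretrained sentence-embeddings model.
    let model = SentenceEmbeddingsBuilder::remote(SentenceEmbeddingsModelType::AllMiniLmL12V2)
        .create_model()?;

    // Encode a couple of sentences and print the resulting vectors.
    let sentences = ["this is an example sentence", "each sentence is converted"];
    let embeddings = model.encode(&sentences)?;
    println!("{:?}", embeddings);
    Ok(())
}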

When I run the simple program below, I get compile errors.
I have tried different versions of the precompiled libtorch (v2.1.0, v2.2.0, v2.3.1) and it makes no difference.

c++ --version
c++ (Debian 12.2.0-14) 12.2.0

Error message on $ cargo run:

  cargo:warning=                 from /home/gseg/lib/libtorch/include/torch/csrc/api/include/torch/autograd.h:5,
  cargo:warning=                 from /home/gseg/lib/libtorch/include/torch/csrc/api/include/torch/all.h:7,
  cargo:warning=                 from /home/gseg/lib/libtorch/include/torch/csrc/api/include/torch/torch.h:3,
  cargo:warning=                 from libtch/torch_api.h:6:
  cargo:warning=/home/gseg/lib/libtorch/include/ATen/ops/_cslt_sparse_mm.h:26:173: note: in passing argument 4 of ‘at::Tensor at::_cslt_sparse_mm(const Tensor&, const Tensor&, const std::optional<Tensor>&, const std::optional<Tensor>&, std::optional<c10::ScalarType>, bool)’
  cargo:warning=   26 | inline at::Tensor _cslt_sparse_mm(const at::Tensor & compressed_A, const at::Tensor & dense_B, const c10::optional<at::Tensor> & bias={}, const c10::optional<at::Tensor> & alpha={}, c10::optional<at::ScalarType> out_dtype=c10::nullopt, bool transpose_result=false) {
  cargo:warning=      |                                                                                                                                           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~
  exit status: 0
  exit status: 1
  cargo:warning=ToolExecError: Command "c++" "-O0" "-ffunction-sections" "-fdata-sections" "-fPIC" "-gdwarf-4" "-fno-omit-frame-pointer" "-m64" "-I" "/home/gseg/lib/libtorch/include" "-I" "/home/gseg/lib/libtorch/include/torch/csrc/api/include" "-Wl,-rpath=/home/gseg/lib/libtorch/lib" "-std=c++17" "-D_GLIBCXX_USE_CXX11_ABI=1" "-o" "/home/gseg/code/bert-test/target/debug/build/torch-sys-3b0471bf716afe6c/out/19072f24a82f85ae-torch_api_generated.o" "-c" "libtch/torch_api_generated.cpp" with args c++ did not execute successfully (status code exit status: 1).
  exit status: 0

  --- stderr


  error occurred: Command "c++" "-O0" "-ffunction-sections" "-fdata-sections" "-fPIC" "-gdwarf-4" "-fno-omit-frame-pointer" "-m64" "-I" "/home/gseg/lib/libtorch/include" "-I" "/home/gseg/lib/libtorch/include/torch/csrc/api/include" "-Wl,-rpath=/home/gseg/lib/libtorch/lib" "-std=c++17" "-D_GLIBCXX_USE_CXX11_ABI=1" "-o" "/home/gseg/code/bert-test/target/debug/build/torch-sys-3b0471bf716afe6c/out/19072f24a82f85ae-torch_api_generated.o" "-c" "libtch/torch_api_generated.cpp" with args c++ did not execute successfully (status code exit status: 1).

Code below

src/main.rs
extern crate anyhow;
use rust_bert::{
    gpt_neo::{
        GptNeoConfigResources, GptNeoMergesResources, GptNeoModelResources, GptNeoVocabResources,
    },
    pipelines::common::{ModelResource, ModelType},
    pipelines::text_generation::{TextGenerationConfig, TextGenerationModel},
    resources::RemoteResource,
};

fn main() -> anyhow::Result<()> {
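    // Pretrained GPT-Neo 2.7B weights, config, vocab, and merges files are fetched remotely on first run.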
    let model_resource = Box::new(RemoteResource::from_pretrained(
        GptNeoModelResources::GPT_NEO_2_7B,
    ));
    let config_resource = Box::new(RemoteResource::from_pretrained(
        GptNeoConfigResources::GPT_NEO_2_7B,
    ));
    let vocab_resource = Box::new(RemoteResource::from_pretrained(
        GptNeoVocabResources::GPT_NEO_2_7B,
    ));
    let merges_resource = Box::new(RemoteResource::from_pretrained(
        GptNeoMergesResources::GPT_NEO_2_7B,
    ));

    // Generation settings: beam search with 5 beams, no repeated bigrams, up to 100 tokens.
    let generate_config = TextGenerationConfig {
        model_type: ModelType::GPTNeo,
        model_resource: ModelResource::Torch(model_resource),
        config_resource,
        vocab_resource,
        merges_resource: Some(merges_resource),
        num_beams: 5,
        no_repeat_ngram_size: 2,
        max_length: Some(100),
        ..Default::default()
    };
    let model = TextGenerationModel::new(generate_config).unwrap();

    // Read one line of the form prefix/prompt1/prompt2..., generate continuations, print them, and exit.
    loop {
        let mut line = String::new();
        std::io::stdin().read_line(&mut line).unwrap();
        let split = line.split('/').collect::<Vec<&str>>();
        let slc = split.as_slice();
        let output = model.generate(&slc[1..], Some(slc[0]));
        for sentence in output? {
            println!("{:?}", sentence);
        }
        return Ok(());
    }
}
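
Note that the failure happens while cargo is building the torch-sys dependency (its build script compiles libtch/torch_api_generated.cpp against the libtorch headers), so it does not appear to come from the code above but from building torch-sys against this libtorch.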