Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)

Overview

rust-bert

Rust-native implementation of Transformer-based models, ported from Hugging Face's Transformers library using the tch-rs crate and pre-processing from rust-tokenizers. Supports multi-threaded tokenization and GPU inference. This repository exposes the model base architectures, task-specific heads (see below) and ready-to-use pipelines. Benchmarks are available at the end of this document.

The following models are currently implemented, with task-specific heads (depending on the model) for sequence classification, token classification, question answering, text generation, summarization, translation and masked language modelling:

  • DistilBERT
  • MobileBERT
  • BERT
  • RoBERTa
  • GPT
  • GPT2
  • BART
  • Marian
  • Electra
  • ALBERT
  • T5
  • XLNet
  • Reformer
  • ProphetNet
  • Longformer

Ready-to-use pipelines

Based on Hugging Face's pipelines, ready-to-use end-to-end NLP pipelines are available as part of this crate. The following capabilities are currently available:

Disclaimer The contributors of this repository are not responsible for any generation from the 3rd party utilization of the pretrained systems proposed herein.

1. Question Answering

Extractive question answering from a given question and context. DistilBERT model fine-tuned on SQuAD (Stanford Question Answering Dataset).

    let qa_model = QuestionAnsweringModel::new(Default::default())?;
                                                        
    let question = String::from("Where does Amy live ?");
    let context = String::from("Amy lives in Amsterdam");

    let answers = qa_model.predict(&[QaInput { question, context }], 1, 32);

Output:

[Answer { score: 0.9976814985275269, start: 13, end: 21, answer: "Amsterdam" }]

2. Translation

Translation using the MarianMT architecture and pre-trained models from the Opus-MT team from Language Technology at the University of Helsinki. The currently supported languages are:

  • English <-> French
  • English <-> Spanish
  • English <-> Portuguese
  • English <-> Italian
  • English <-> Catalan
  • English <-> German
  • English <-> Russian
  • English <-> Chinese (Simplified)
  • English <-> Chinese (Traditional)
  • English <-> Dutch
  • English <-> Swedish
  • English <-> Arabic
  • English <-> Hebrew
  • English <-> Hindi
  • French <-> German

    let translation_config = TranslationConfig::new(Language::EnglishToFrench, Device::cuda_if_available());
    let mut model = TranslationModel::new(translation_config)?;
                                                        
    let input = ["This is a sentence to be translated"];
    let output = model.translate(&input);

Output:

Il s'agit d'une phrase à traduire
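
The same API covers all language pairs listed above. As a minimal sketch mirroring the example above (the Language::EnglishToGerman variant name is assumed from the supported-language list), several sentences can also be translated in a single batch:

    let translation_config = TranslationConfig::new(Language::EnglishToGerman, Device::cuda_if_available());
    let mut model = TranslationModel::new(translation_config)?;

    // Several sentences can be submitted as a single batch.
    let input = ["This is a sentence to be translated", "The weather is nice today"];
    let output = model.translate(&input);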

3. Summarization

Abstractive summarization using a pretrained BART model.

    let summarization_model = SummarizationModel::new(Default::default())?;
                                                        
    let input = ["In findings published Tuesday in Cornell University's arXiv by a team of scientists \
from the University of Montreal and a separate report published Wednesday in Nature Astronomy by a team \
from University College London (UCL), the presence of water vapour was confirmed in the atmosphere of K2-18b, \
a planet circling a star in the constellation Leo. This is the first such discovery in a planet in its star's \
habitable zone — not too hot and not too cold for liquid water to exist. The Montreal team, led by Björn Benneke, \
used data from the NASA's Hubble telescope to assess changes in the light coming from K2-18b's star as the planet \
passed between it and Earth. They found that certain wavelengths of light, which are usually absorbed by water, \
weakened when the planet was in the way, indicating not only does K2-18b have an atmosphere, but the atmosphere \
contains water in vapour form. The team from UCL then analyzed the Montreal team's data using their own software \
and confirmed their conclusion. This was not the first time scientists have found signs of water on an exoplanet, \
but previous discoveries were made on planets with high temperatures or other pronounced differences from Earth. \
\"This is the first potentially habitable planet where the temperature is right and where we now know there is water,\" \
said UCL astronomer Angelos Tsiaras. \"It's the best candidate for habitability right now.\" \"It's a good sign\", \
said Ryan Cloutier of the Harvard–Smithsonian Center for Astrophysics, who was not one of either study's authors. \
\"Overall,\" he continued, \"the presence of water in its atmosphere certainly improves the prospect of K2-18b being \
a potentially habitable planet, but further observations will be required to say for sure. \"
K2-18b was first identified in 2015 by the Kepler space telescope. It is about 110 light-years from Earth and larger \
but less dense. Its star, a red dwarf, is cooler than the Sun, but the planet's orbit is much closer, such that a year \
on K2-18b lasts 33 Earth days. According to The Guardian, astronomers were optimistic that NASA's James Webb space \
telescope — scheduled for launch in 2021 — and the European Space Agency's 2028 ARIEL program, could reveal more \
about exoplanets like K2-18b."];

    let output = summarization_model.summarize(&input);

(example from: WikiNews)

Output:

"Scientists have found water vapour on K2-18b, a planet 110 light-years from Earth. 
This is the first such discovery in a planet in its star's habitable zone. 
The planet is not too hot and not too cold for liquid water to exist."

4. Dialogue Model

Conversation model based on Microsoft's DialoGPT. This pipeline allows the generation of single- or multi-turn conversations between a human and a model. DialoGPT's page states that

The human evaluation results indicate that the response generated from DialoGPT is comparable to human response quality under a single-turn conversation Turing test. (DialoGPT repository)

The model uses a ConversationManager to keep track of active conversations and generate responses to them.

    use rust_bert::pipelines::conversation::{ConversationModel, ConversationManager};

    let conversation_model = ConversationModel::new(Default::default())?;
    let mut conversation_manager = ConversationManager::new();

    let conversation_id = conversation_manager.create("Going to the movies tonight - any suggestions?");
    let output = conversation_model.generate_responses(&mut conversation_manager);

Example output:

"The Big Lebowski."

5. Natural Language Generation

Generate language based on a prompt. GPT2 and GPT are available as base models. Includes techniques such as beam search, top-k and nucleus sampling, temperature setting and repetition penalty. Supports batch generation of sentences from several prompts. Sequences will be left-padded with the model's padding token if present, or with the unknown token otherwise. This may impact the results; it is recommended to submit prompts of similar length for best results.

    let model = GPT2Generator::new(Default::default())?;
                                                        
    let input_context_1 = "The dog";
    let input_context_2 = "The cat was";

    let output = model.generate(Some(&[input_context_1, input_context_2]), 0, 30, true, false, 
                                5, 1.2, 0, 0.9, 1.0, 1.0, 3, 3, None);

Example output:

[
    "The dog's owners, however, did not want to be named. According to the lawsuit, the animal's owner, a 29-year"
    "The dog has always been part of the family. \"He was always going to be my dog and he was always looking out for me"
    "The dog has been able to stay in the home for more than three months now. \"It's a very good dog. She's"
    "The cat was discovered earlier this month in the home of a relative of the deceased. The cat\'s owner, who wished to remain anonymous,"
    "The cat was pulled from the street by two-year-old Jazmine.\"I didn't know what to do,\" she said"
    "The cat was attacked by two stray dogs and was taken to a hospital. Two other cats were also injured in the attack and are being treated."
]

6. Zero-shot classification

Performs zero-shot classification on input sentences with provided labels using a model fine-tuned for Natural Language Inference.

    let sequence_classification_model = ZeroShotClassificationModel::new(Default::default())?;

    let input_sentence = "Who are you voting for in 2020?";
    let input_sequence_2 = "The prime minister has announced a stimulus package which was widely criticized by the opposition.";
    let candidate_labels = &["politics", "public health", "economics", "sports"];

    let output = sequence_classification_model.predict_multilabel(
        &[input_sentence, input_sequence_2],
        candidate_labels,
        None,
        128,
    );

Output:

[
  [ Label { "politics", score: 0.972 }, Label { "public health", score: 0.032 }, Label {"economics", score: 0.006 }, Label {"sports", score: 0.004 } ],
  [ Label { "politics", score: 0.975 }, Label { "public health", score: 0.0818 }, Label {"economics", score: 0.852 }, Label {"sports", score: 0.001 } ],
]

7. Sentiment analysis

Predicts the binary sentiment for a sentence. DistilBERT model fine-tuned on SST-2.

    let sentiment_classifier = SentimentModel::new(Default::default())?;
                                                        
    let input = [
        "Probably my all-time favorite movie, a story of selflessness, sacrifice and dedication to a noble cause, but it's not preachy or boring.",
        "This film tried to be too many things all at once: stinging political satire, Hollywood blockbuster, sappy romantic comedy, family values promo...",
        "If you like original gut wrenching laughter you will like this movie. If you are young or old then you will love this movie, hell even my mom liked it.",
    ];

    let output = sentiment_classifier.predict(&input);

(Example courtesy of IMDb)

Output:

[
    Sentiment { polarity: Positive, score: 0.9981985493795946 },
    Sentiment { polarity: Negative, score: 0.9927982091903687 },
    Sentiment { polarity: Positive, score: 0.9997248985164333 }
]

8. Named Entity Recognition

Extracts entities (Person, Location, Organization, Miscellaneous) from text. BERT cased large model fine-tuned on CoNLL03, contributed by the MDZ Digital Library team at the Bavarian State Library. Models are currently available for English, German, Spanish and Dutch.

    let ner_model = NERModel::new(Default::default())?;

    let input = [
        "My name is Amy. I live in Paris.",
        "Paris is a city in France."
    ];
    
    let output = ner_model.predict(&input);

Output:

[
  [
    Entity { word: "Amy", score: 0.9986, label: "I-PER" }
    Entity { word: "Paris", score: 0.9985, label: "I-LOC" }
  ],
  [
    Entity { word: "Paris", score: 0.9988, label: "I-LOC" }
    Entity { word: "France", score: 0.9993, label: "I-LOC" }
  ]
]

9. Part of Speech tagging

Extracts Part of Speech tags (Noun, Verb, Adjective...) from text.

    let ner_model = NERModel::new(Default::default())?;

    let input = ["My name is Bob"];

    let output = ner_model.predict(&input);

Output:

[
    Entity { word: "My", score: 0.1560, label: "PRP" }
    Entity { word: "name", score: 0.6565, label: "NN" }
    Entity { word: "is", score: 0.3697, label: "VBZ" }
    Entity { word: "Bob", score: 0.7460, label: "NNP" }
]

Benchmarks

For simple pipelines (sequence classification, token classification, question answering), performance between Python and Rust is expected to be comparable. This is because the most expensive part of these pipelines is the language model itself, which shares a common implementation in the Torch backend. The End-to-end NLP Pipelines in Rust paper provides a benchmark section covering all pipelines.

For text generation tasks (summarization, translation, conversation, free text generation), significant benefits can be expected (up to 2 to 4 times faster processing depending on the input and application). The article Accelerating text generation with Rust focuses on these text generation applications and provides more details on the performance comparison to Python.

Base models

The base models and task-specific heads are also available for users looking to expose their own transformer-based models. Examples of how to prepare the data using the native rust-tokenizers library are available in ./examples for BERT, DistilBERT, RoBERTa, GPT, GPT2 and BART. Note that when importing models from PyTorch, the parameter naming convention needs to be aligned with the Rust schema. Loading of the pre-trained weights will fail if any of the model parameter weights cannot be found in the weight files. If this quality check is to be skipped, the alternative method load_partial can be invoked from the variable store.
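
As an illustration, the following is a minimal sketch of loading a converted BERT checkpoint into the base masked language model architecture. The file paths are placeholders, and the exact module paths (in particular for the tokenizer) may vary slightly between crate versions:

    use rust_bert::bert::{BertConfig, BertForMaskedLM};
    use rust_bert::Config;
    use rust_tokenizers::tokenizer::BertTokenizer;
    use tch::{nn, Device};

    let device = Device::cuda_if_available();
    let mut vs = nn::VarStore::new(device);

    // Tokenizer built from a local vocabulary file (lower-casing and accent stripping enabled).
    let tokenizer = BertTokenizer::from_file("path/to/vocab.txt", true, true)?;

    // Build the architecture from the configuration, then load the converted weights.
    let config = BertConfig::from_file("path/to/config.json");
    let bert_model = BertForMaskedLM::new(&vs.root(), &config);
    vs.load("path/to/rust_model.ot")?; // fails if any parameter is missing; use load_partial to skip the check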

Setup

A number of pretrained model configurations, weights and vocabulary files are downloaded directly from Hugging Face's model repository. The list of models available with Rust-compatible weights is available at https://huggingface.co/models?filter=rust. The models will be downloaded to the location set in the RUSTBERT_CACHE environment variable if it is defined, otherwise to ~/.cache/.rustbert. Additional models can be added if of interest; please raise an issue.

In order to load custom weights into the library, these need to be converted to a binary format that can be read by Libtorch (the original .bin files are pickles and cannot be used directly). Several Python scripts to load PyTorch weights and convert them to the appropriate format are provided and can be adapted based on the model's needs.

  1. Compile the package: cargo build
  2. Download the model files & perform the necessary conversions:
    • Set up a virtual environment and install dependencies
    • Download the PyTorch model of interest (pytorch_model.bin from Hugging Face's model repository)
    • Run the conversion script: python ./utils/convert_model.py <PATH_TO_PYTORCH_WEIGHTS>
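
Once converted, a pipeline can be pointed at the local files instead of the default remote resources. The following is a hedged sketch assuming the Resource/LocalResource types and the public model_resource, config_resource and vocab_resource configuration fields exposed by recent versions of the crate:

    use std::path::PathBuf;
    use rust_bert::pipelines::question_answering::{QuestionAnsweringConfig, QuestionAnsweringModel};
    use rust_bert::resources::{LocalResource, Resource};

    // Point the question answering pipeline at locally converted files.
    let config = QuestionAnsweringConfig {
        model_resource: Resource::Local(LocalResource { local_path: PathBuf::from("path/to/rust_model.ot") }),
        config_resource: Resource::Local(LocalResource { local_path: PathBuf::from("path/to/config.json") }),
        vocab_resource: Resource::Local(LocalResource { local_path: PathBuf::from("path/to/vocab.txt") }),
        ..Default::default()
    };
    let qa_model = QuestionAnsweringModel::new(config)?;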

Citation

If you use rust-bert for your work, please cite End-to-end NLP Pipelines in Rust:

@inproceedings{becquin-2020-end,
    title = "End-to-end {NLP} Pipelines in Rust",
    author = "Becquin, Guillaume",
    booktitle = "Proceedings of Second Workshop for NLP Open Source Software (NLP-OSS)",
    year = "2020",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.nlposs-1.4",
    pages = "20--25",
}

Acknowledgements

Thank you to Hugging Face for hosting a set of weights compatible with this Rust library. The ready-to-use pretrained models are listed at https://huggingface.co/models?filter=rust.

Comments
  • Panic on `sequence_history.drain` conversation.rs

    So I have an odd one. I am writing a small chat bot using the conversation model. All is good for the first 3 responses. I write and the bot responds in less than a second. However, always on the 4th response it takes a long time to respond and then panics at:

       9: std::panicking::default_hook                                                                                                                                         [35/22742]
                 at src/libstd/panicking.rs:218
      10: std::panicking::rust_panic_with_hook
                 at src/libstd/panicking.rs:486
      11: rust_begin_unwind
                 at src/libstd/panicking.rs:388
      12: core::panicking::panic_fmt
      13: alloc::vec::Vec<T>::drain::start_assert_failed
                 at src/libcore/../stdarch/crates/core_arch/src/macros.rs:403
      14: alloc::vec::Vec<T>::drain
                 at /rustc/c367798cfd3817ca6ae908ce675d1d99242af148/src/liballoc/vec.rs:1335
      15: rust_bert::pipelines::conversation::ConversationModel::clean_padding_indices
                 at ~/.cargo/registry/src/github.com-1ecc6299db9ec823/rust-bert-0.10.0/src/pipelines/conversation.rs:720
      16: rust_bert::pipelines::conversation::ConversationModel::generate_responses
                 at ~/.cargo/registry/src/github.com-1ecc6299db9ec823/rust-bert-0.10.0/src/pipelines/conversation.rs:678
    

    Which is on sequence_history.drain

    fn clean_padding_indices(&self, model_output: &mut Vec<Vec<i64>>) -> Vec<(usize, usize)> {
            // In case inputs are sent as batch, this cleans the padding indices in the history for shorter outputs
            let pad_token = self.model.get_pad_id().unwrap_or(self.eos_token_id);
            let mut removed_tokens = Vec::with_capacity(model_output.len());
            for sequence_history in model_output {
                let index_end = sequence_history
                    .iter()
                    .rev()
                    .position(|&r| r != pad_token)
                    .unwrap();
                let index_start = sequence_history
                    .iter()
                    .position(|&r| r != pad_token)
                    .unwrap();
                sequence_history.drain(sequence_history.len() - index_end + 1..); // <- PANICS HERE
                sequence_history.drain(..index_start);
                removed_tokens.push((index_start, index_end));
            }
            removed_tokens
        }
    

    Some advice would be appreciated; I can probably clone and compile it myself if you want me to do any testing.

    I'm not sure where this is coming from or why it is always the 4th response...

    opened by QuantumEntangledAndy 19
  • add cached-path file utils

    closes #72

    So far I've just added a common/file_utils.rs module. The idea being that this would replace what's currently in common/resources.rs.

    My question for you is, do you want to keep the LocalResource and RemoteResource abstractions?

    opened by epwalsh 15
  • GPT2 model for T5Tokenizer does not work.

    GPT2 1.3B is a 24-layer, 2048-hidden-size transformer-based language model; its vocabulary resource file is spiece.model.

    I have converted pytorch_model.bin to rust_model.ot successfully. I modified the examples/summarization_t5.rs file as a testing template.

    Error: Tokenizer error: Error when loading vocabulary file, the file may be corrupted or does not match the expected format: unexpected wire type

    When I use Python, it works fine. For example:

    import torch
    from transformers import T5Tokenizer, AutoModelForCausalLM

    tokenizer = T5Tokenizer.from_pretrained("rinna/japanese-gpt-1b")
    model = AutoModelForCausalLM.from_pretrained("rinna/japanese-gpt-1b")
    # model = GPT2LMHeadModel.from_pretrained("rinna/japanese-gpt-1b")

    if torch.cuda.is_available():
        model = model.to("cuda")

    text = "夏目漱石は、"
    token_ids = tokenizer.encode(text, add_special_tokens=False, return_tensors="pt")
    with torch.no_grad():
        output_ids = model.generate(
            token_ids.to(model.device),
            max_length=100,
            min_length=100,
            do_sample=True,
            top_k=500,
            top_p=0.95,
            pad_token_id=tokenizer.pad_token_id,
            bos_token_id=tokenizer.bos_token_id,
            eos_token_id=tokenizer.eos_token_id,
            bad_word_ids=[[tokenizer.unk_token_id]]
        )

    output = tokenizer.decode(output_ids.tolist()[0])
    print(output)

    Is this purpose and feature not yet implemented?

    opened by ycat3 13
  • How to use grpc for gpt2 generation?

    The gpt2 model is defined as a mutable model, so when I'm using the Rust gRPC server, I always get a `*mut torch_sys::C_tensor` cannot be shared between threads safely error.

    It's not advisable to load the model for each call; what I need is to load it once, so that every gRPC thread can use the model to generate text.

    Is there any proper advice for that? Thanks.

    opened by hscspring 13
  • prop:Help:StartUp

    I have a simple GPT2 model with rust_model.ot uploaded at https://huggingface.co/remotejob/tweetsGPT2fi_v1/tree/main. Please give me some ideas for getting started.

    My Python version works correctly:

    from transformers import AutoModelWithLMHead, AutoTokenizer, pipeline

    model = AutoModelWithLMHead.from_pretrained("remotejob/tweetsGPT2fi_v1")
    tokenizer = AutoTokenizer.from_pretrained("remotejob/tweetsGPT2fi_v1")
    generator = pipeline('text-generation', model=model, tokenizer=tokenizer)
    res = generator("Kuka sei")
    print(res)

    opened by remotejob 12
  • Initial attempt at a REST API

    Thanks for this project, it's really helped me out.

    So I thought I'd share my crude attempt at creating an API for this project. I'm just using it for the sentiment at the moment, but I'm planning to expand it to some other functions too. I'm posting here just in case anyone else is interested, or if anyone has any suggestions. Eg; I wonder if I'm instantiating SentimentModel in the best place to share memory usage between requests?

    Maybe there's even interest in including this in the project here, if so I'd be happy to help.

    https://github.com/tombh/rust-bert-api

    opened by tombh 12
  • ConversationModel panic on large input (with 3-4 lines of text)

    thread 'tokio-runtime-worker' panicked at 'called Result::unwrap() on an Err value: Torch("index out of range in self\nException raised from operator() at ../aten/src/ATen/native/TensorAdvancedIndexing.cpp:677 (most recent call first):\nframe #0: c10::Error::Error(c10::SourceLocation, std::__cxx11::basic_string<char, std::char_traits, std::allocator >) + 0x69 (0x7f634c696b29 in Amadeus/target/release/build/torch-sys-8dc810d88f246fa1/out/libtorch/libtorch/lib/libc10.so)\nframe #1: + 0xc12390 (0x7f634d4e5390 in Amadeus/target/release/build/torch-sys-8dc810d88f246fa1/out/libtorch/libtorch/lib/libtorch_cpu.so)\nframe #2: + 0x10e5d4a (0x7f634d9b8d4a in Amadeus/target/release/build/torch-sys-8dc810d88f246fa1/out/libtorch/libtorch/lib/libtorch_cpu.so)\nframe #3: + 0x1b766 (0x7f634c61e766 in gcc/x86_64-pc-linux-gnu/11.1.0/libgomp.so.1)\nframe #4: + 0x7e3e (0x7f634a1d5e3e in /lib64/libpthread.so.0)\nframe #5: clone + 0x3f (0x7f6349fd173f in libc.so.6)\n")', .cargo/registry/src/github.com-1ecc6299db9ec823/tch-0.4.1/src/wrappers/tensor_generated.rs:3649:87

    opened by Miezhiko 11
  • More generic token classification pipeline supporting multiple models

    Hi @guillaume-be!

    First of all, thank you very much for this wonderful piece of software you have created!

    This draft pull request contains my attempt to build a more generic token classification pipeline. I realized that your NER pipeline (src/pipelines/ner.rs) was hard coded to use Bert (with the conll03 models hard-coded as well) but I am in need of something more flexible/generic that can accommodate all token classification models you implemented (bert, roberta, distilbert) and with whatever tokenizer/config configuration they are suited for. I felt such a pipeline would fit best in rust-bert itself. Would you be open to such a contribution?

    What I did is the following:

    • I started with a copy of your ner.rs, but moved away from all the NER-centric naming as all this serves PoS tagging and other uses just as well. The idea is that token_classification.rs could eventually replace ner.rs if you agree; for now I implemented it as an independent copy though.
    • I built an abstractions interface over various *Configs, *Tokenizers and *ForTokenClassification models so they can be switched as needed, a kind of polymorphism if you will.
    • The abstractions are currently implemented through enums (ConfigOption, TokenizerOption and TokenClassificationOption) rather than through traits and/or generics, as the latter option would be far more invasive to the rest of the library and things could get fairly complicated quickly. I wonder what your view is on this?
    • I created one TokenClassificationModel (my adaptation of your NERModel) which now works with the above abstracted interfaces. (TokenClassificationConfiguration is the adaptation of your NERConfig)

    Most of the work can also be applied to the other pipelines I think, but I haven't gone there yet, it will require some further reorganization probably.

    By the way, I'm working on a CLI tool that is using your library (with my changes) to make available various NLP pipelines (and eventually more) for end-users: https://github.com/proycon/deepfrog (pre-alpha!).

    Note: this is a draft pull request; it's not ready for merge yet even though 'it works', and I might still tweak some things. But I wanted to open the discussion already, to see whether you are open to such new features and to hear any input you may have.

    opened by proycon 10
  • Add sbert implementation for inference

    This PR adds a generic implementation for Sentence-Transformers inference #185.

    The main components are:

    • The SBertModel struct templated by T: SBertTransformer (which is a sealed trait)
    • Layers are defined based on modules.json (support for pooler and optional dense/normalization)
    • Implementations of SBertTransformer:
      • UsingAlbert, UsingBert , UsingDistilBert, UsingRoberta, UsingT5
      • All tested against results from Python implementations available on Hugging Face Hub
      • Tests are marked with #[ignore] as they require downloading and converting the models beforehand (instructions are in the test comments)
    • The script utils/convert_model.py has new parameters to be able to convert sbert models to Rust.

    This PR is marked as draft because:

    • Currently UsingT5 is not working, as T5Model always defines a decoder and sbert models contain only an encoder. I don't know what the best approach for an "encoder only T5 model" would be; do you have any preference/insight?
    • Maybe there are more implementations to add? If there are more models to support I can add them.
    • Maybe SBertTransformer doesn't need to be sealed, what do you think?
    • Attention in UsingAlbert is averaged across the hidden groups to have the type Vec<Tensor> (instead of Vec<Vec<Tensor>>), which is the type of all other implementations. Is this OK with you?
    opened by lerouxrgd 9
  • Failed to run cargo run on Linux

    Hello @guillaume-be

    I'm trying to run the rust-bert examples and it looks like I have issues with torch-sys.

    Running cargo run --example generation --verbose gives me the following errors:

           Fresh autocfg v1.0.1
           Fresh cfg-if v1.0.0
           Fresh cc v1.0.67
           Fresh pkg-config v0.3.19
           Fresh unicode-xid v0.2.2
           Fresh lazy_static v1.4.0
           Fresh itoa v0.4.7
           Fresh byteorder v1.4.3
           Fresh pin-project-lite v0.2.6
           Fresh bytes v1.0.1
           Fresh openssl-probe v0.1.2
           Fresh adler v1.0.2
           Fresh ahash v0.4.7
           Fresh futures-core v0.3.14
           Fresh tinyvec_macros v0.1.0
           Fresh foreign-types-shared v0.1.1
           Fresh ppv-lite86 v0.2.10
           Fresh matches v0.1.8
           Fresh fnv v1.0.7
           Fresh version_check v0.9.3
           Fresh pin-utils v0.1.0
           Fresh once_cell v1.7.2
           Fresh futures-io v0.3.14
           Fresh unicode-width v0.1.8
           Fresh either v1.6.1
           Fresh regex-syntax v0.6.25
           Fresh slab v0.4.3
           Fresh futures-sink v0.3.14
           Fresh scopeguard v1.1.0
           Fresh futures-task v0.3.14
           Fresh semver-parser v0.7.0
           Fresh percent-encoding v2.1.0
           Fresh try-lock v0.2.3
           Fresh httpdate v1.0.0
           Fresh tower-service v0.3.1
           Fresh rawpointer v0.2.1
           Fresh half v1.7.1
           Fresh smallvec v1.6.1
           Fresh opaque-debug v0.3.0
           Fresh ipnet v2.3.0
           Fresh cpuid-bool v0.1.2
           Fresh remove_dir_all v0.5.3
           Fresh plotters-backend v0.3.0
           Fresh mime v0.3.16
           Fresh base64 v0.13.0
           Fresh number_prefix v0.3.0
           Fresh same-file v1.0.6
           Fresh glob v0.3.0
           Fresh oorandom v11.1.3
           Fresh tracing-core v0.1.18
           Fresh cmake v0.1.45
           Fresh regex-automata v0.1.9
           Fresh hashbrown v0.9.1
           Fresh foreign-types v0.3.2
           Fresh tinyvec v1.2.0
           Fresh futures-channel v0.3.14
           Fresh unicode-bidi v0.3.5
           Fresh http v0.2.4
           Fresh itertools v0.10.0
           Fresh textwrap v0.11.0
           Fresh itertools v0.9.0
           Fresh semver v0.9.0
           Fresh form_urlencoded v1.0.1
           Fresh matrixmultiply v0.2.4
           Fresh libc v0.2.94
           Fresh proc-macro2 v1.0.26
           Fresh unicode-normalization-alignments v0.1.12
           Fresh plotters-svg v0.3.0
           Fresh walkdir v2.3.2
           Fresh memchr v2.4.0
           Fresh log v0.4.14
           Fresh bitflags v1.2.1
           Fresh ryu v1.0.5
           Fresh crc32fast v1.2.1
           Fresh typenum v1.13.0
           Fresh tracing v0.1.26
           Fresh unicode-normalization v0.1.17
           Fresh anyhow v1.0.40
           Fresh http-body v0.4.1
           Fresh httparse v1.4.0
           Fresh rustc_version v0.2.3
           Fresh encoding_rs v0.8.28
           Fresh quote v1.0.9
           Fresh openssl-sys v0.9.62
           Fresh num_cpus v1.13.0
           Fresh socket2 v0.4.0
           Fresh mio v0.7.11
           Fresh bzip2-sys v0.1.10+1.0.8
           Fresh miniz_oxide v0.4.4
           Fresh crossbeam-utils v0.8.4
           Fresh time v0.1.43
           Fresh getrandom v0.2.2
           Fresh libz-sys v1.1.3
           Fresh aho-corasick v0.7.18
           Fresh futures-util v0.3.14
           Fresh getrandom v0.1.16
           Fresh want v0.3.0
           Fresh csv-core v0.1.10
           Fresh terminal_size v0.1.16
           Fresh xattr v0.2.2
           Fresh filetime v0.2.14
           Fresh protobuf v2.22.1
           Fresh fs2 v0.4.3
           Fresh dirs-sys v0.3.6
           Fresh atty v0.2.14
           Fresh num-traits v0.2.14
           Fresh memoffset v0.6.3
           Fresh indexmap v1.6.2
           Fresh generic-array v0.14.4
           Fresh idna v0.2.3
           Fresh clap v2.33.3
           Fresh syn v1.0.72
           Fresh bzip2 v0.3.3
           Fresh flate2 v1.0.20
           Fresh tokio v1.5.0
           Fresh rand_core v0.6.2
           Fresh crossbeam-channel v0.5.1
           Fresh uuid v0.8.2
           Fresh curl-sys v0.4.42+curl-7.76.0
           Fresh openssl v0.10.34
           Fresh regex v1.5.3
           Fresh rand_core v0.5.1
           Fresh tar v0.4.33
           Fresh dirs v3.0.2
           Fresh serde_derive v1.0.125
           Fresh thiserror-impl v1.0.24
           Fresh pin-project-internal v1.0.7
           Fresh crossbeam-epoch v0.9.4
           Fresh tokio-util v0.6.6
           Fresh rand_chacha v0.3.0
           Fresh digest v0.9.0
           Fresh block-buffer v0.9.0
           Fresh url v2.2.1
           Fresh num-complex v0.2.4
           Fresh num-integer v0.1.44
           Fresh ordered-float v2.2.0
           Fresh plotters v0.3.0
           Fresh serde v1.0.125
           Fresh thiserror v1.0.24
           Fresh native-tls v0.2.7
           Fresh crossbeam-deque v0.8.0
           Fresh curl v0.4.36
           Fresh pin-project v1.0.7
           Fresh h2 v0.3.3
           Fresh console v0.14.1
           Fresh rand v0.8.3
           Fresh rand_chacha v0.2.2
           Fresh sha2 v0.9.3
           Fresh cast v0.2.5
           Fresh zip v0.5.12
           Fresh serde_json v1.0.64
           Fresh hyper v0.14.7
           Fresh rayon-core v1.9.0
           Fresh bstr v0.2.16
           Fresh tokio-native-tls v0.3.0
           Fresh serde_urlencoded v0.7.0
           Fresh ndarray v0.13.1
           Fresh indicatif v0.15.0
           Fresh rand v0.7.3
           Fresh tempfile v3.2.0
           Fresh serde_cbor v0.11.1
           Fresh hyper-tls v0.5.0
           Fresh csv v1.1.6
           Fresh rayon v1.5.0
           Fresh zip-extensions v0.6.0
           Fresh criterion-plot v0.4.3
           Fresh tinytemplate v1.2.1
           Fresh reqwest v0.11.3
           Fresh rust_tokenizers v6.2.2
           Fresh torch-sys v0.3.1
           Fresh cached-path v0.5.1
           Fresh criterion v0.3.4
           Fresh tch v0.3.1
       Compiling rust-bert v0.14.0 (/home/mathis/Dev/rust-bert)
         Running `rustc --crate-name generation --edition=2018 examples/generation.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C embed-bitcode=no -C debuginfo=2 -C metadata=1079cbdd0923672d -C extra-filename=-1079cbdd0923672d --out-dir /home/mathis/Dev/rust-bert/target/debug/examples -C incremental=/home/mathis/Dev/rust-bert/target/debug/incremental -L dependency=/home/mathis/Dev/rust-bert/target/debug/deps --extern anyhow=/home/mathis/Dev/rust-bert/target/debug/deps/libanyhow-deb63606ff1cfc02.rlib --extern cached_path=/home/mathis/Dev/rust-bert/target/debug/deps/libcached_path-ed4f91024ed4fc3e.rlib --extern criterion=/home/mathis/Dev/rust-bert/target/debug/deps/libcriterion-11283086a0db42c9.rlib --extern csv=/home/mathis/Dev/rust-bert/target/debug/deps/libcsv-c5db6ce464e9affd.rlib --extern dirs=/home/mathis/Dev/rust-bert/target/debug/deps/libdirs-d779fef0aee33dfd.rlib --extern lazy_static=/home/mathis/Dev/rust-bert/target/debug/deps/liblazy_static-cd2401a06b1676f6.rlib --extern ordered_float=/home/mathis/Dev/rust-bert/target/debug/deps/libordered_float-9d94cc6b3ed64167.rlib --extern rust_bert=/home/mathis/Dev/rust-bert/target/debug/deps/librust_bert-2ec9c634d44292b9.rlib --extern rust_tokenizers=/home/mathis/Dev/rust-bert/target/debug/deps/librust_tokenizers-d60a711620946171.rlib --extern serde=/home/mathis/Dev/rust-bert/target/debug/deps/libserde-3f736418ac579112.rlib --extern serde_json=/home/mathis/Dev/rust-bert/target/debug/deps/libserde_json-56cb869844da73de.rlib --extern tch=/home/mathis/Dev/rust-bert/target/debug/deps/libtch-3489097ebbda946c.rlib --extern tempfile=/home/mathis/Dev/rust-bert/target/debug/deps/libtempfile-cb2a539468ebc1f7.rlib --extern thiserror=/home/mathis/Dev/rust-bert/target/debug/deps/libthiserror-849666d06ac7e440.rlib --extern torch_sys=/home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib --extern uuid=/home/mathis/Dev/rust-bert/target/debug/deps/libuuid-bf8e4fb0cc067046.rlib -L native=/usr/lib -L native=/home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib -L native=/home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out`
    error: linking with `cc` failed: exit code: 1
      |
      = note: "cc" "-Wl,--as-needed" "-Wl,-z,noexecstack" "-m64" "-Wl,--eh-frame-hdr" "-L" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.10it0tix8kemlzzz.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.167xi68m3zomqb7.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1b5qfw6lrgq9ixu9.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1d0ql6qb0gfr40an.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1lo4kpl4iw1j0xzu.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1ltk8i68f9tjta0v.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1mif4ri8vrsnk1lb.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1n0sgz8zmfb1yk8e.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1p2f8f4m2gx4bs7c.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1pffsc4z38348shw.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1r7u2nys3lazyjqj.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1t5v18zbjjusjg3b.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.1vr0f0tzbpd6tsr6.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.232uq55vzhtlx2y0.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.28sfi6qkpnmtd1jv.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.2apsawysjbfj41cw.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.2efxy0h6puoi2k9u.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.2k2r5jq5bv472xuc.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.2ocmatgwqmq8wdin.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.2vu57mhs8b1rdwyd.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.34armeuadq7ou4g7.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.34pc7e99jb7tqj70.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3b3ohz2lemdo994y.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3d6prhey778singb.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3mdf12i52d5gzn59.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3ohu9xvnsquxze42.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3pq0r8m6j1ykpyz5.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3qg9lpklsjsdpcaz.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3rkm1e4qxh8n4c3l.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3sp583ijiw4ov2sz.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3uqpvz1x50awpida.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3vol26id67ntsbxl.rcgu.o" 
"/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3whurftq1sb0v984.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.49z5c534dy9uim1a.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.4ae8f3av8lqccarn.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.4h6k1vvx97o7e6tb.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.4krufbcvoktpje1z.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.4kv6jrdjrkzycgr8.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.4o3sj4udjrggx8mj.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.4wajjfybl1w1bx97.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.52ooz6okopac6bou.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.5cqzl22pfezok8hy.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.5ge8c24exj1sggnk.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.akk6cuaji4u6oih.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.ctpuo9x1x6wcg53.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.odzqcam2z1zbxzn.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.rz5bc76mat974dm.rcgu.o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.t62z9mjebfibll8.rcgu.o" "-o" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d" "/home/mathis/Dev/rust-bert/target/debug/examples/generation-1079cbdd0923672d.3jhp2i5vw46ruhq7.rcgu.o" "-Wl,--gc-sections" "-pie" "-Wl,-zrelro" "-Wl,-znow" "-nodefaultlibs" "-L" "/home/mathis/Dev/rust-bert/target/debug/deps" "-L" "/usr/lib" "-L" "/home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib" "-L" "/home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out" "-L" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib" "-Wl,-Bstatic" "/home/mathis/Dev/rust-bert/target/debug/deps/librust_bert-2ec9c634d44292b9.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libuuid-bf8e4fb0cc067046.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcached_path-ed4f91024ed4fc3e.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libindicatif-8a0185076a0eb177.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libconsole-c943e32fe490a795.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libterminal_size-29f5eab3f5b081c5.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libunicode_width-85f6c02835e49e35.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libnumber_prefix-f42c307c9580ef59.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtar-fd5427c40075e783.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libxattr-9989c107df9b07c7.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libfiletime-f00939867a7a5d10.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libsha2-b96c29815f69105c.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcpuid_bool-77e420fc3482a298.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libopaque_debug-e255b5a7231feb45.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libdigest-5e985337ea6e7525.rlib" 
"/home/mathis/Dev/rust-bert/target/debug/deps/libblock_buffer-1a5ff49bbac509bb.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libgeneric_array-c6e858b52fea96d4.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtypenum-d38b2426cfe2d60a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libreqwest-3a20498a91e06669.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libhyper_tls-41c83e33a6039bb3.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libipnet-80aef0b8131aa337.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtokio_native_tls-b2e970837a80eae8.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libserde_urlencoded-737747b42f1af893.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libmime-4ab1ec3a2ae719c1.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libencoding_rs-f79ff60439fd89ce.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libbase64-2eaf2a28e351c3cf.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libnative_tls-aff207f6c062165a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libopenssl_probe-3fc17c5c1d53a384.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libopenssl-8449b8ae49b7be20.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libonce_cell-e4cdb0ff14231943.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libbitflags-df474a8ca45884a8.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libforeign_types-f70576c32fbfaf7f.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libforeign_types_shared-c71078ab8c7eadf1.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libopenssl_sys-70482dbbf5b1d608.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libhyper-8a00342c18d2c3a1.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libwant-99e08c6aeea13938.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtry_lock-41d3178ef58d6d67.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libsocket2-811a7a403c31caed.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libh2-477112006cb6758a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libindexmap-b32181fdc7afcc4c.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtokio_util-72f75e6a6423678a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libfutures_sink-2577d8e12fcf6991.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libhttparse-648ab1cdd3cbae9c.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtower_service-9282d801dbb17df2.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libfutures_channel-35431632760d3376.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libpin_project-a21d56dd1042478a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtracing-7c279f1a677a4fa0.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtracing_core-a2c6a62ee372e2c8.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtokio-3879a93bf360f2b5.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libmio-56ac13146c4b2860.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libhttp_body-34a615cfe7b2c0c6.rlib" "/home/mathis/Dev/rust-bert/target/
    debug/deps/libfutures_util-b7a30bee20f5fe71.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libfutures_io-f314da869344a9a8.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libslab-dd6edc0ed1ca4f6b.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libpin_project_lite-06d42e9781056991.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libfutures_task-b04b5f0e69e53ffb.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libpin_utils-20f290c37612163f.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libfutures_core-a16351367c097e6e.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/liburl-4c6ee3061aae65b7.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libidna-3dd9383909fbc055.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libunicode_bidi-0fe025d2b861b5b3.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libform_urlencoded-93c34c4329f24067.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libpercent_encoding-bae4c6d884296e4c.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libmatches-b31feef5c3ce1062.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libhttp-241165acd5255fd4.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libbytes-0ff5e007e0c7e08c.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libfnv-eef866999c50c67e.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/liblog-2f3f18ae3a68c2f8.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libglob-c57da5286c7fae6e.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libfs2-a48de2555a8471db.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libzip_extensions-353c3f8d4e9ed7c3.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtempfile-cb2a539468ebc1f7.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librand-1c75c86b13b5580a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librand_chacha-256b875f4a5a3d6b.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librand_core-d1cdd31427f095d4.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libgetrandom-7dc64231f7b99450.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libremove_dir_all-b44ab79d858f6f76.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librust_tokenizers-d60a711620946171.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcsv-c5db6ce464e9affd.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcsv_core-ebe164817eda8c9b.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libbstr-4668142ce77a56b7.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libregex_automata-8cb0baf15346bda4.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libserde_json-56cb869844da73de.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libryu-bc3a68e8fbaa57e5.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libitoa-f9dbccd7af7d6b3a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libhashbrown-60d09b3993522df1.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libahash-749bc2fb31b31530.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libprotobuf-c58299b47f54fed5.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libunicode_normalization_alignments-206b7fe2a9ea75f1.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libsmallvec-0f55f5d6a54014e3.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libunicode_normalization-67b65b54ebb28bc8.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtinyvec-2a8082d1c3b39837.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtinyvec_macros-a8a3510ab9d2d66d.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libregex-837648ba3d68bb20.rlib" 
"/home/mathis/Dev/rust-bert/target/debug/deps/libaho_corasick-a0f2b910ddf545ab.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libmemchr-abe191a7fe2cb81e.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libregex_syntax-d8c25f7d5a9c013c.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librayon-2d0d238453171a73.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librayon_core-f097f6ff0debde4e.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libnum_cpus-db12bd77386e7612.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcrossbeam_deque-54b2fd51490e9920.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcrossbeam_epoch-f62cfc549debd93a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libmemoffset-833e5f62b4ed8191.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libscopeguard-4c9e30f2e7684fc3.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcrossbeam_channel-041e91dffad4bb57.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcrossbeam_utils-d38a60b6601ae3a6.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libitertools-4d8dca7ba26edf3b.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libeither-976ba12e28a58b3f.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libserde-3f736418ac579112.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libordered_float-9d94cc6b3ed64167.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtch-3489097ebbda946c.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libndarray-5e50a2ad469eca8a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libmatrixmultiply-e953bca71897c4d4.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libnum_complex-94ed3827aa717c55.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libnum_integer-1111d064da27e043.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libnum_traits-a8f43e054a44200b.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librawpointer-5f566245f6baa3b8.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librand-c96fafd91f6fb884.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librand_chacha-b3dc7a75b6fcc1e7.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libppv_lite86-954a0753d9a936e0.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/librand_core-d163b764f498228a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libgetrandom-75286b24b01f3ec6.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libhalf-3df75a818605f784.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libzip-a0e4d565bc7812ee.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libtime-ac2f880c23971580.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libbzip2-2bf74b0aaaed15ea.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libbzip2_sys-aebd69e543b5cce6.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libflate2-55dac049734f889a.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libminiz_oxide-ae9bfcf09048bed3.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libadler-0dfc8a797665414c.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libbyteorder-4126bddb3356bb8d.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcrc32fast-516f19d8f84806f9.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libcfg_if-2909e55669d95bfc.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libthiserror-849666d06ac7e440.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/liblazy_static-cd2401a06b1676f6.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libdirs-d779fef0aee33dfd.rlib" 
"/home/mathis/Dev/rust-bert/target/debug/deps/libdirs_sys-d14c8c2444a3fc5b.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/liblibc-d51a272aa641e225.rlib" "/home/mathis/Dev/rust-bert/target/debug/deps/libanyhow-deb63606ff1cfc02.rlib" "-Wl,--start-group" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libstd-6e0e72ef3f331f94.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libpanic_unwind-eed7c8ea6eea20e8.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libminiz_oxide-637cb1b53c807e95.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libadler-099cf0af4375543b.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libobject-b5f18e83369ef257.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libaddr2line-3bb19daa4485d5fe.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libgimli-5298ab0591e7fb29.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/librustc_demangle-2e4947d254d0b599.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libhashbrown-1532436f783b0405.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/librustc_std_workspace_alloc-3d12d76f5782439f.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libunwind-87f8d20d4e058c86.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libcfg_if-d41f1ff31e4e0f27.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/liblibc-0d5ea4f2d39b8e27.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/liballoc-31288459e6a43502.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/librustc_std_workspace_core-c52e5d6301e1bd59.rlib" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libcore-2675a9a46b5cec89.rlib" "-Wl,--end-group" "/home/mathis/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libcompiler_builtins-f51baad7bbcb81c4.rlib" "-Wl,-Bdynamic" "-lssl" "-lcrypto" "-lstdc++" "-ltorch" "-ltorch_cpu" "-lc10" "-lgomp" "-lbz2" "-lgcc_s" "-lutil" "-lrt" "-lpthread" "-lm" "-ldl" "-lc"
      = note: /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta::TypeMeta()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:457: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<caffe2::detail::_Uninitialized>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<unsigned char>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<unsigned char>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<signed char>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<signed char>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<short>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<short>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<int>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<int>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<long>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<long>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<c10::Half>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::Half>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<float>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<float>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<double>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<double>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<c10::complex<c10::Half> >()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::complex<c10::Half> >()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<c10::complex<float> >()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::complex<float> >()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<c10::complex<double> >()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::complex<double> >()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<bool>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<bool>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<c10::qint8>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::qint8>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<c10::quint8>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::quint8>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<c10::qint32>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::qint32>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib(torch_api.o): in function `caffe2::TypeMeta caffe2::TypeMeta::Make<c10::BFloat16>()':
              /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/include/c10/util/typeid.h:439: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::BFloat16>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `c10::C10FlagsRegistry[abi:cxx11]()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `c10::TensorImpl::TensorImpl(c10::Storage&&, c10::DispatchKeySet, caffe2::TypeMeta const&)'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `c10::TensorImpl::TensorImpl(c10::DispatchKeySet, caffe2::TypeMeta const&, c10::optional<c10::Device>)'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `c10::impl::tls_local_dispatch_key_set()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `bool c10::C10FlagParser::Parse<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >*)'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `c10::TensorImpl::TensorImpl(c10::DispatchKeySet, caffe2::TypeMeta const&, c10::optional<c10::Device>)'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `c10::MessageLogger::MessageLogger(char const*, int, int)'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<char>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `FLAGS_caffe2_keep_on_shrink'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `c10::TensorImpl::TensorImpl(c10::Storage&&, c10::DispatchKeySet, caffe2::TypeMeta const&)'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<std::unique_ptr<std::mutex, std::default_delete<std::mutex> > >()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `c10::MessageLogger::~MessageLogger()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<unsigned short>()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<std::unique_ptr<std::atomic<bool>, std::default_delete<std::atomic<bool> > > >()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `bool c10::C10FlagParser::Parse<int>(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int*)'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `FLAGS_caffe2_max_keep_on_shrink_memory'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >()'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `bool c10::C10FlagParser::Parse<double>(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, double*)'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `bool c10::C10FlagParser::Parse<bool>(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, bool*)'
              /usr/bin/ld: /home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib/libtorch_cpu.so: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<std::vector<int, std::allocator<int> > >()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<long>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::complex<float> >()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::BFloat16>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::complex<double> >()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<bool>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::Half>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<float>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<int>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<double>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::complex<c10::Half> >()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::quint8>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::qint32>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<c10::qint8>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<unsigned char>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<caffe2::detail::_Uninitialized>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<signed char>()'
              /usr/bin/ld: /usr/bin/ld generated: undefined reference to `caffe2::detail::TypeMetaData const* caffe2::TypeMeta::_typeMetaDataInstance<short>()'
              collect2: error: ld returned 1 exit status
              
    
    error: aborting due to previous error
    
    error: could not compile `rust-bert`
    
    Caused by:
      process didn't exit successfully: `rustc --crate-name generation --edition=2018 examples/generation.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type bin --emit=dep-info,link -C embed-bitcode=no -C debuginfo=2 -C metadata=1079cbdd0923672d -C extra-filename=-1079cbdd0923672d --out-dir /home/mathis/Dev/rust-bert/target/debug/examples -C incremental=/home/mathis/Dev/rust-bert/target/debug/incremental -L dependency=/home/mathis/Dev/rust-bert/target/debug/deps --extern anyhow=/home/mathis/Dev/rust-bert/target/debug/deps/libanyhow-deb63606ff1cfc02.rlib --extern cached_path=/home/mathis/Dev/rust-bert/target/debug/deps/libcached_path-ed4f91024ed4fc3e.rlib --extern criterion=/home/mathis/Dev/rust-bert/target/debug/deps/libcriterion-11283086a0db42c9.rlib --extern csv=/home/mathis/Dev/rust-bert/target/debug/deps/libcsv-c5db6ce464e9affd.rlib --extern dirs=/home/mathis/Dev/rust-bert/target/debug/deps/libdirs-d779fef0aee33dfd.rlib --extern lazy_static=/home/mathis/Dev/rust-bert/target/debug/deps/liblazy_static-cd2401a06b1676f6.rlib --extern ordered_float=/home/mathis/Dev/rust-bert/target/debug/deps/libordered_float-9d94cc6b3ed64167.rlib --extern rust_bert=/home/mathis/Dev/rust-bert/target/debug/deps/librust_bert-2ec9c634d44292b9.rlib --extern rust_tokenizers=/home/mathis/Dev/rust-bert/target/debug/deps/librust_tokenizers-d60a711620946171.rlib --extern serde=/home/mathis/Dev/rust-bert/target/debug/deps/libserde-3f736418ac579112.rlib --extern serde_json=/home/mathis/Dev/rust-bert/target/debug/deps/libserde_json-56cb869844da73de.rlib --extern tch=/home/mathis/Dev/rust-bert/target/debug/deps/libtch-3489097ebbda946c.rlib --extern tempfile=/home/mathis/Dev/rust-bert/target/debug/deps/libtempfile-cb2a539468ebc1f7.rlib --extern thiserror=/home/mathis/Dev/rust-bert/target/debug/deps/libthiserror-849666d06ac7e440.rlib --extern torch_sys=/home/mathis/Dev/rust-bert/target/debug/deps/libtorch_sys-73690ae916a3995c.rlib --extern uuid=/home/mathis/Dev/rust-bert/target/debug/deps/libuuid-bf8e4fb0cc067046.rlib -L native=/usr/lib -L native=/home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out/libtorch/libtorch/lib -L native=/home/mathis/Dev/rust-bert/target/debug/build/torch-sys-d810dc21875d10ae/out` (exit code: 1)
    

    After taking a look at #127, I tried to downgrade to libtorch 1.7.1, but without success. I also tried to follow the "Libtorch Manual Install" documentation from the tch-rs crate.

    Best regards,

    opened by eonm-abes 9
  • Request: Make private `model` inside ConversationManager public

    I would like to discuss making the members of ConversationModel public. Here is why:

    What I want to be able to do is take past_user_inputs and generated_responses from the conversation and regenerate history. history is the token-id representation of the conversation history and is the actual input into the conversation model, whereas past_user_inputs and generated_responses are the string representations used for output. This would allow me to trim the inputs to, say, the last N inputs, or make other adjustments.

    Originally I tried something like this:

    // Prepare the past_user_inputs and generated_responses that I would like to turn into the context
    let texts: &[&str] = ....

    // Replace history using this:
    conversation.history = conversation_manager.model.get_tokenizer().convert_tokens_to_ids(
        conversation_manager.model.get_tokenizer().tokenize_list(texts.to_vec())
    );
    

    Unfortunately, I discovered that conversation_manager.model is private.

    Would there be any downsides to making it public, or perhaps adding a convenience method to recreate the inputs?

    Alternatively, just making encode_prompts public would suit my needs, but more general access to model would probably be more useful in the future.
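
    A minimal sketch of the trimming described above, using purely hypothetical types (the `Tokenizer` trait and `encode_turn` stand in for whatever tokenizer calls rust-bert makes internally; none of this is the crate's actual API):

    trait Tokenizer {
        fn encode_turn(&self, text: &str) -> Vec<i64>;
    }

    /// Rebuild a token-id history from only the last `n` (user, bot) exchanges.
    fn rebuild_history<T: Tokenizer>(
        tokenizer: &T,
        past_user_inputs: &[String],
        generated_responses: &[String],
        n: usize,
    ) -> Vec<Vec<i64>> {
        past_user_inputs
            .iter()
            .zip(generated_responses.iter())
            // keep only the last n exchanges, in their original order
            .rev()
            .take(n)
            .rev()
            .flat_map(|(user, bot)| [user.as_str(), bot.as_str()])
            .map(|turn| tokenizer.encode_turn(turn))
            .collect()
    }

    If `model` (or just `encode_prompts`) were reachable, the result of something like this could then be assigned back to `conversation.history` as in the snippet above.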

    opened by QuantumEntangledAndy 9
  • Permits TokenClassificationOption / DistilBertForTokenClassification to fail gracefully for an invalid configuration.

    A production system shouldn't panic if an invalid config is present. The implementation is placed in a new method so that the change is non-breaking.
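
    A minimal sketch of the non-breaking pattern described above (the type and method names here are illustrative only, not the actual rust-bert signatures):

    use std::collections::HashMap;

    #[derive(Debug)]
    struct InvalidConfigError(String);

    struct HeadConfig {
        id2label: HashMap<i64, String>,
    }

    struct TokenClassificationHead {
        num_labels: usize,
    }

    impl TokenClassificationHead {
        /// Existing behaviour: panic on an invalid configuration.
        fn new(config: &HeadConfig) -> Self {
            Self::try_new(config).expect("invalid configuration")
        }

        /// New, additive method: surface the problem as an error instead of panicking.
        fn try_new(config: &HeadConfig) -> Result<Self, InvalidConfigError> {
            if config.id2label.is_empty() {
                return Err(InvalidConfigError(
                    "token classification requires a non-empty id2label mapping".to_string(),
                ));
            }
            Ok(Self { num_labels: config.id2label.len() })
        }
    }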

    opened by andyHa 1
  • Added DialoGPT Small and Large Models

    In order to make the small and large variants easier to access, I have added them. WIP: I am currently in the process of uploading the Rust models to Hugging Face.

    opened by copoer 2
  • Custom Long T5 model missing tensor values after conversion

    Howdy,

    I am trying to convert this Hugging Face model to a Rust model so I can use it for summarization: https://huggingface.co/pszemraj/long-t5-tglobal-base-16384-book-summary. The Python conversion script works fine, but when I try to use the resulting model it gives me an error message saying there are missing weights. Is there a way to resolve this?

     TchError("cannot find the tensor named encoder.block.3.layer.0.SelfAttention.o.weight in rust_model.ot
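
    A quick way to check what actually made it into the converted weights file is to list the tensor names it contains. The sketch below assumes tch's `Tensor::load_multi`, which (as far as I know) reads the named tensors of an `.ot` file; the path is a placeholder:

    use tch::Tensor;

    fn main() -> anyhow::Result<()> {
        // Placeholder path to the converted model produced by the Python script.
        let tensors = Tensor::load_multi("path/to/rust_model.ot")?;
        for (name, tensor) in &tensors {
            if name.starts_with("encoder.block.3.layer.0") {
                println!("{name}: {:?}", tensor.size());
            }
        }
        Ok(())
    }

    If the reported tensor never shows up in that listing, the weight was lost on the conversion side rather than in rust-bert's loading code.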
    
    opened by copoer 3
  • failed to build the example

    error occurred: Command "c++" "-O0" "-ffunction-sections" "-fdata-sections" "-fPIC" "-gdwarf-4" "-fno-omit-frame-pointer" "-m64" "-I" "/opt/libtorch/include" "-I" "/opt/libtorch/include/torch/csrc/api/include" "-Wl,-rpath=/opt/libtorch/lib" "-std=c++14" "-D_GLIBCXX_USE_CXX11_ABI=1" "-o" "/home/john/Projects/conversation/target/debug/build/torch-sys-b3197540abc4eebd/out/libtch/torch_api.o" "-c" "libtch/torch_api.cpp" with args "c++" did not execute successfully (status code exit status: 1).

    opened by MaYuandong 4
  • Add MPS as default if available

    Can we show some love for the Mac M1 people out there? MPS doesn't seem any harder to select when available than CUDA, and tch-rs seems to include it in its Device enum.

    I'm happy to open a PR myself if anyone can suggest the right approach. Should it simply choose MPS the same way cuda_if_available does, and default to it if available? Or should we start by checking for CUDA, then for MPS, and only then default to the CPU?
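
    A minimal sketch of that fallback order, assuming a tch-rs version that exposes `Device::Mps` and `tch::utils::has_mps()` (treat the MPS availability check as an assumption if your tch version predates it):

    use tch::Device;

    fn best_available_device() -> Device {
        if tch::Cuda::is_available() {
            // Same behaviour as cuda_if_available: prefer the first CUDA device.
            Device::Cuda(0)
        } else if tch::utils::has_mps() {
            // Apple Silicon: fall back to the Metal Performance Shaders backend.
            Device::Mps
        } else {
            Device::Cpu
        }
    }

    This mirrors what cuda_if_available does today, with MPS inserted between CUDA and the CPU fallback.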

    opened by jkoudys 3
  • [Feature Request] Joint Models

    Hey @guillaume-be, awesome job on this.

    I'm trying to have one model for entity recognition and text classification. There are some implementations available in the Python ecosystem for this, and I'm just wondering if it's possible with this project. I want to avoid loading two different models in my application when it could be done with one.

    Something like this but for Rust. What would need to be done?
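
    A conceptual sketch of what a joint setup could look like with plain tch-rs (the names are illustrative, not rust-bert APIs): one shared encoder output feeding both a token-level head and a sequence-level head.

    use tch::{nn, nn::Module, Tensor};

    struct JointHeads {
        token_head: nn::Linear,    // per-token labels (entity recognition)
        sequence_head: nn::Linear, // one label per sequence (text classification)
    }

    impl JointHeads {
        fn new(p: &nn::Path, hidden: i64, n_token_labels: i64, n_classes: i64) -> Self {
            JointHeads {
                token_head: nn::linear(p / "token", hidden, n_token_labels, Default::default()),
                sequence_head: nn::linear(p / "sequence", hidden, n_classes, Default::default()),
            }
        }

        /// `encoder_output` has shape [batch, seq_len, hidden] and comes from a single shared backbone.
        fn forward(&self, encoder_output: &Tensor) -> (Tensor, Tensor) {
            let token_logits = self.token_head.forward(encoder_output);
            // Use the first token's hidden state as a pooled sequence representation.
            let pooled = encoder_output.select(1, 0);
            let class_logits = self.sequence_head.forward(&pooled);
            (token_logits, class_logits)
        }
    }

    Wiring this into rust-bert would mean exposing the shared encoder's hidden states to both heads, which is the part that would need design work.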

    opened by arctic-bunny 2