Commit

Create issue templates
ikergarcia1996 authored Oct 9, 2023
1 parent bd7090b commit 1b154a5
Showing 4 changed files with 106 additions and 0 deletions.
12 changes: 12 additions & 0 deletions .github/ISSUE_TEMPLATE/✍️-other-issue.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,12 @@
---
name: "✍️ Other Issue"
about: For issues not covered by the other templates.
title: ''
labels: ''
assignees: ''

---

Please provide a concise description of your issue. Include as much information as possible: if relevant, your system hardware, software versions, and any other details that could help us find a solution.

If you want to report a bug or any error you encountered while running the code, please use the "Bug report and Issues" template instead.
56 changes: 56 additions & 0 deletions .github/ISSUE_TEMPLATE/🐛-bug-report-and-issues.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,56 @@
---
name: "\U0001F41B Bug report and Issues"
about: Submit a bug report to help us improve GoLLIE
title: "[BUG] Bug or issue report"
labels: ''
assignees: ''

---

**Describe the task**
1. Model: Which GoLLIE model are you attempting to run?
2. Task: Which task are you attempting to run (training, evaluation, dataset generation, ...)?

**Describe the bug**
A clear and concise description of what the bug is. You can add the error traceback or screenshots here.

**To Reproduce**
Steps to reproduce the behavior:
1. Load model X
```Python
model, tokenizer = load_model(
inference=True,
model_weights_name_or_path="HiTZ/GoLLIE-7B",
quantization=None,
use_lora=False,
force_auto_device_map=True,
use_flash_attention=True,
torch_dtype="bfloat16"
)
```
2. Run X function
```Python
model_output = model.generate(
**model_input.to(model.device),
max_new_tokens=128,
do_sample=False,
min_new_tokens=0,
num_beams=1,
num_return_sequences=1,
)
```
3. Any other steps required to reproduce the behavior

**Expected behavior**
A clear and concise description of what you expected to happen.

**System Info**
1. GPU: (e.g., NVIDIA A100)
2. PyTorch version:
3. Transformers version:
4. Model configuration: Are you using 4-bit / 8-bit quantization? Are you using multi-GPU? etc.
5. Any other relevant information:


**Additional context**
Add any other context about the problem here.
20 changes: 20 additions & 0 deletions .github/ISSUE_TEMPLATE/📑-new-task-dataset-proposals.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,20 @@
---
name: "\U0001F4D1 New Task/Dataset Proposals"
about: Submit a proposal/request for a new task support in GoLLIE
title: "[TASK] I want task X to be supported by GoLLIE"
labels: ''
assignees: ''

---

**Describe the task you'd like to implement**
A clear and concise description of the task/dataset. If you want to propose a new dataset, please explain whether the task is already supported by GoLLIE (e.g., Named Entity Recognition, Event Extraction, Relation Extraction, ...). If it is not, explain how you would implement it.

**Data**
Is the data for the task publicly available? If it is, provide a link to it. If it is not, explain how the data can be obtained.

**Guidelines**
Are annotation guidelines available? If they are, provide a link to them. If they are not, describe how you would generate them.

**Your contribution**
Is there any way that you could help, e.g. by submitting a PR?
18 changes: 18 additions & 0 deletions .github/ISSUE_TEMPLATE/🚀-feature-request.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,18 @@
---
name: "\U0001F680 Feature request"
about: Submit a proposal/request for a new GoLLIE feature
title: "[FEATURE] I want feature X supported by GoLLIE"
labels: ''
assignees: ''

---

**Feature request**
A clear and concise description of the feature proposal.

**Motivation**
Please outline the motivation for the proposal. Is your feature request related to a problem? (e.g., "I'm always frustrated when [...]"). If this is related to another GitHub issue, please link it here as well.


**Your contribution**
Is there any way that you could help, e.g. by submitting a PR?
