
issue "Preparing submission... this may take a few moments.." never run just in submitting stage #1708

Open
SondosBsharat opened this issue Dec 18, 2024 · 26 comments
Labels
Competition-specific: Problem specific to a given competition or benchmark

Comments

@SondosBsharat

How do I solve this issue?
[screenshot]
The submission keeps the Submitting status and never changes; even the log file is empty.
[screenshot]

@ObadaS
Collaborator

ObadaS commented Dec 18, 2024

Hello, can you post the link of the competition, or send it to us ([email protected])?

@SondosBsharat
Author

I have not published it yet; I am testing it before publishing.

@ObadaS
Collaborator

ObadaS commented Dec 18, 2024

I see that the default queue is behaving correctly on our end; other competitions can run submissions without problems. I also ran a submission on my own test competition and everything worked correctly.

Maybe the problem comes from the way your competition is configured? Are you using the default Docker image for the competition or a custom one?

@SondosBsharat
Author

SondosBsharat commented Dec 18, 2024

What is the default one? Actually, this is what I wrote, but it should be the default one for the queue:
[screenshot]
Because in my competition, participants should only submit a CSV file with the results, which will be evaluated based on the scoring file and reference data.

@ObadaS
Collaborator

ObadaS commented Dec 18, 2024

I don't know if it will fix your problem, but you can try to use codalab/codalab-legacy:py39 instead of python:3.8

@SondosBsharat
Author

It's still the same issue; there are no logs at all and the status stays at Submitting.

@ObadaS
Collaborator

ObadaS commented Dec 18, 2024

Can you add me as an organizer? I'll try to make some submissions to see if mine work.
My username is obada.

@SondosBsharat
Author

SondosBsharat commented Dec 18, 2024

Sure. Is it working for you?

@ObadaS
Collaborator

ObadaS commented Dec 18, 2024

I was able to submit something which successfully got past the Submitted status on the MM competition.

I added you to one of my test competitions (called Iris); please try to submit something there.

@SondosBsharat
Author

SondosBsharat commented Dec 18, 2024

I did, and it works. Maybe there is an issue with the paths in my scoring.py file?

@ObadaS
Collaborator

ObadaS commented Dec 18, 2024

I ran your predictions.zip and it also got past the Submitted status. This looks like a strange issue...
What browser are you using?

@SondosBsharat
Author

SondosBsharat commented Dec 18, 2024

Chrome. Is it evaluated? I mean, can you move it to the leaderboard?

@ObadaS
Collaborator

ObadaS commented Dec 18, 2024

I tried submitting on Chrome, and it also worked fine. Maybe you can try using Firefox, but it shouldn't change much, since you can submit on my Iris test competition...

I can't add it to the leaderboard because the submission failed (time limit exceeded), but it did finish.
As an organizer, you can check the status of all submissions on your competition by clicking on the Submissions button under the competition title:
[screenshot]
You will see my submission attempts and their status, and have some actions available (like putting them on the leaderboard, rerunning them, etc.).

@SondosBsharat
Author

I reran yours again, but it is still the same. Anyway, if others are able to run it, then no problem. I have increased the time limit; can you resubmit to make sure the scoring file will run? Because up until this point, it has not run yet.

@ObadaS
Collaborator

ObadaS commented Dec 18, 2024

I submitted it again.
I see that you wrote about the scoring.py earlier. Is that supposed to be in the predictions.zip file? The one I downloaded only contains a .csv file inside the .zip.

@SondosBsharat
Author

No, I mean the scoring program that evaluates the submission against the ground truth.

@SondosBsharat
Author

Is there a specific format for that file? Can you double-check it?

@ObadaS
Collaborator

ObadaS commented Dec 18, 2024

You can find examples here: https://github.com/codalab/competition-examples/tree/master/codabench
The wiki also has some guides to help you out: https://github.com/codalab/codabench/wiki#2-organizers

I will bring your issue up with the team at our next meeting; hopefully I will have more information for you afterward.

Concerning your current file, the worker log might give you more information as to what is missing:

[2024-12-18 20:57:34,277: INFO/ForkPoolWorker-79] Running scoring program, and then ingestion program
[2024-12-18 20:57:34,278: INFO/ForkPoolWorker-79] Program directory missing metadata, assuming it's going to be handled by ingestion program so move it to output
[2024-12-18 20:57:34,278: INFO/ForkPoolWorker-79] /codabench/tmpn8c5lom6/ingestion_program not found, no program to execute
[2024-12-18 20:57:34,279: INFO/ForkPoolWorker-79] 9.5367431640625e-07
[2024-12-18 20:57:39,284: INFO/ForkPoolWorker-79] 5.005452394485474
[2024-12-18 20:57:44,289: INFO/ForkPoolWorker-79] 10.01093602180481
[2024-12-18 20:57:49,295: INFO/ForkPoolWorker-79] 15.016502618789673
[2024-12-18 20:57:54,301: INFO/ForkPoolWorker-79] 20.02216148376465
[2024-12-18 20:57:59,306: INFO/ForkPoolWorker-79] 25.027804851531982
[2024-12-18 20:58:04,312: INFO/ForkPoolWorker-79] 30.033443450927734
[2024-12-18 20:58:09,318: INFO/ForkPoolWorker-79] 35.03897023200989
[2024-12-18 20:58:14,323: INFO/ForkPoolWorker-79] 40.04458689689636
[2024-12-18 20:58:19,329: INFO/ForkPoolWorker-79] 45.05028676986694
[2024-12-18 20:58:24,334: INFO/ForkPoolWorker-79] 50.055930376052856
[2024-12-18 20:58:29,340: INFO/ForkPoolWorker-79] 55.06159567832947
[2024-12-18 20:58:34,346: INFO/ForkPoolWorker-79] 60.067262172698975
[2024-12-18 20:58:34,346: WARNING/ForkPoolWorker-79] Detailed results not written to after 60 seconds, exiting!
[2024-12-18 20:58:39,351: INFO/ForkPoolWorker-79] 65.07292580604553

It tries to write the results for 20 minutes and then fails, since we have a 20-minute execution limit on the default queue.
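For reference, two earlier lines in that log ("Program directory missing metadata" and "ingestion_program not found, no program to execute") suggest the worker never executed any program at all: Codabench program bundles are expected to include a metadata file that declares the command to run. The one-liner below is an assumption; the exact filename and command syntax should be copied from the Iris example bundle linked above:

command: python score.py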

@SondosBsharat
Author

I tried to fix them, but I am having this issue now. Are there any constraints on how to name the files, or does the submitted file keep its own name?
[screenshot]

@SondosBsharat
Author

It seems that the submitted file does not trigger the evaluation.
[screenshot]

@ObadaS
Collaborator

ObadaS commented Dec 19, 2024

It seems like the problem comes from your scoring program.

I suggest taking a look at the Iris scoring program: https://github.com/codalab/competition-examples/blob/master/codabench/iris/bundle/scoring_program/score.py

You can submit results for this competition (example: https://github.com/codalab/competition-examples/blob/master/codabench/iris/sample_result_submission.zip), which is the same thing you are trying to do.

Maybe the path in which your scoring program is trying to find the data is wrong?
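For comparison, a minimal result-submission scorer modeled on that Iris example could look like the sketch below. The mount points (/app/input, /app/output), the file names, and the accuracy metric are assumptions taken from or simplified relative to the example, not a guaranteed layout for every queue:

import json
import os

# Paths follow the Iris example's container layout (assumption):
# the participant's file lands in input/res, reference data in input/ref.
input_dir = "/app/input"
output_dir = "/app/output"
prediction_file = os.path.join(input_dir, "res", "predictions.csv")
reference_file = os.path.join(input_dir, "ref", "ground_truth.csv")

def read_labels(path):
    # Read the last comma-separated field of every non-header row.
    with open(path) as f:
        rows = f.read().strip().splitlines()
    return [row.split(",")[-1] for row in rows[1:]]

predictions = read_labels(prediction_file)
truth = read_labels(reference_file)

# Placeholder metric: plain accuracy (zip truncates if lengths differ).
accuracy = sum(p == t for p, t in zip(predictions, truth)) / max(len(truth), 1)

# Codabench reads leaderboard scores from scores.json in the output directory;
# the keys must match the leaderboard column keys defined in the competition.
with open(os.path.join(output_dir, "scores.json"), "w") as f:
    json.dump({"accuracy": accuracy}, f)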

@SondosBsharat
Author

Yes, this is the issue. How should I deal with paths? For example, after the submission file is uploaded, how do I refer to it in my ingestion.py file? And after that, how do I pass the result to the scoring program? I tried to follow the examples, but it is still the same issue.

@Didayolo added the Bug and Competition-specific labels Dec 19, 2024
@Didayolo
Member

@SondosBsharat

Did you change paths in the scoring and/or ingestion programs?
Please check the examples pointed to earlier in the discussion and see if your paths are different (inside scoring.py, metadata, etc.).

Another possibility is that the problem comes from zipping with a directory structure.
Basically, when zipping submissions, programs, or data, you should zip the files directly, without a parent directory; see the sketch after this comment.
Can you try this?

You can update your programs following this Wiki page: https://github.com/codalab/codabench/wiki/Resource-Management
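To illustrate the point about directory structure, here is a small sketch using Python's standard zipfile module that builds an archive with the files at its root and prints the entries so you can verify there is no folder prefix; the file names are only examples:

import zipfile

# Files to ship; for a program bundle this would be e.g. score.py and metadata.
files = ["predictions.csv"]

with zipfile.ZipFile("submission.zip", "w") as zf:
    for name in files:
        # arcname=name keeps the entry at the archive root, even if the
        # source file lives in a subdirectory on disk.
        zf.write(name, arcname=name)

# Verify: every entry should be a bare file name, with no "folder/" prefix.
with zipfile.ZipFile("submission.zip") as zf:
    print(zf.namelist())  # expected: ['predictions.csv']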

@SondosBsharat
Author

SondosBsharat commented Dec 19, 2024

These are the paths in my scoring file. Are they correct? Previously, I included an ingestion file, but then I removed it, since I only need the participants to submit their prediction results. However, after removing it, the submission stays in the running state until the time limit is exceeded, or it keeps running if I increase the time limit:
Also, based on the instructions, I removed the metadata.

f name == "main":
try:
# Set up directories
input_dir = sys.argv[1]
output_dir = sys.argv[2]
reference_dir = os.path.join(input_dir, "ref")
prediction_dir = os.path.join(input_dir, "res")

    prediction_file = os.path.join(prediction_dir, "predictions.csv")
    ground_truth_file = os.path.join(reference_dir, "ground_truth.csv")

    # Logging setup
    logging.basicConfig(
        filename=os.path.join(output_dir, "scoring.log"),
        level=logging.INFO,
        format='%(asctime)s - %(levelname)s - %(message)s'
    )

    # Validate files
    if not os.path.exists(prediction_file):
        raise FileNotFoundError(f"Prediction file not found: {prediction_file}")
    if not os.path.exists(ground_truth_file):
        raise FileNotFoundError(f"Ground truth file not found: {ground_truth_file}")

    # Evaluate predictions
    results = evaluate_predictions(prediction_file, ground_truth_file)

    # Write results
    write_results(results, output_dir)

except Exception as e:
    error_message = f"Error occurred: {str(e)}"
    logging.exception(error_message)
    with open(os.path.join(output_dir, "error.log"), 'w') as error_file:
        error_file.write(error_message)
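
The snippet above calls write_results without showing it. Whatever else that helper does, Codabench ultimately looks for a scores.json in the output directory whose keys match the leaderboard column keys; a minimal version, assuming results is a dict like {"accuracy": 0.93}, could be:

import json
import os

def write_results(results, output_dir):
    # Keys in results must match the leaderboard column keys
    # defined in the competition configuration.
    with open(os.path.join(output_dir, "scores.json"), "w") as f:
        json.dump(results, f)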

@SondosBsharat
Author

It seems there is an issue, because even if I include a print statement at the top of the scoring file, nothing is printed, so the file is not being executed at all.

@Didayolo
Member

@SondosBsharat Any update on this? Did you manage to solve the issue?
