So I'm trying to de-obfuscate a file that has 3,781,204 characters. I know it's quite a bit...
I decided to go with Gemini because of the disclaimer about OpenAI's speed. Firstly, the progress is incredibly slow (probably due to the file size). Anyway, after running this command and waiting two hours (getting to 3% completion)
humanify gemini --apiKey="someKey" original.js
I get this error:
Processing file 1/1
Processing: 2%
file:///C:/Users/user/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/@google/generative-ai/dist/index.mjs:401
throw new GoogleGenerativeAIFetchError( Error fetching from ${url.toString()}: [${response.status} ${response.statusText}] ${message} , response.status, response.statusText, errorDetails);
^
GoogleGenerativeAIFetchError: [GoogleGenerativeAI Error]: Error fetching from https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent: [503 Service Unavailable] The model is overloaded. Please try again later.
at handleResponseNotOk (file:///C:/Users/user/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/@google/generative-ai/dist/index.mjs:401:11)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async makeRequest (file:///C:/Users/user/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/@google/generative-ai/dist/index.mjs:374:9)
at async generateContent (file:///C:/Users/user/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/@google/generative-ai/dist/index.mjs:817:22)
at async file:///C:/Users/user/AppData/Roaming/npm/node_modules/humanifyjs/dist/index.mjs:56818:24
at async visitAllIdentifiers (file:///C:/Users/user/AppData/Roaming/npm/node_modules/humanifyjs/dist/index.mjs:56604:21)
at async file:///C:/Users/user/AppData/Roaming/npm/node_modules/humanifyjs/dist/index.mjs:56810:12
at async unminify (file:///C:/Users/user/AppData/Roaming/npm/node_modules/humanifyjs/dist/index.mjs:202:27)
at async Command.<anonymous> (file:///C:/Users/user/AppData/Roaming/npm/node_modules/humanifyjs/dist/index.mjs:56859:3) {
status: 503,
statusText: 'Service Unavailable',
errorDetails: undefined
}
Node.js v18.20.4
Now, from what I understand, this is entirely Google's issue, as stated in this thread. The top answer does suggest implementing a "back-off" feature, similar to rate-limit prevention techniques. Is it possible to do this, or is there another cause related to my machine?
tysm
The top answer does suggest implementing a "back-off" feature, similar to rate-limit prevention techniques. Is it possible to do this, or is there another cause related to my machine?
@Bruno-Alumn Based on a quick skim of the error + StackOverflow page, I think you're right in thinking this is more of a problem with Google + humanify not having a good 'error recovery mechanism' rather than anything specific with your machine.
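For reference, a retry-with-exponential-back-off wrapper around the failing API call could look something like the sketch below. This is a hypothetical helper, not humanify's actual code; the `withBackoff` name and options are made up for illustration, and the status codes checked (503/429) are the transient ones from the error above:

```javascript
// Sketch of an exponential back-off wrapper (hypothetical helper, not part of humanify).
// Retries the wrapped async call when it fails with a transient HTTP status
// (503 Service Unavailable or 429 Too Many Requests), doubling the delay
// between attempts until the retry budget is exhausted.
async function withBackoff(fn, { retries = 5, baseDelayMs = 1000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      const transient = err.status === 503 || err.status === 429;
      if (!transient || attempt >= retries) throw err; // give up: rethrow
      const delayMs = baseDelayMs * 2 ** attempt; // 1s, 2s, 4s, 8s, ...
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

In humanify's case, something like this would wrap the `generateContent` call that throws the `GoogleGenerativeAIFetchError` above, so a temporary "model is overloaded" response pauses and retries instead of killing a multi-hour run. Adding random jitter to the delay is a common refinement to avoid synchronized retries.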
Thank you very much for the reply @0xdevalias! I'll look into contributing to the project to solve the issue, though it seems to have already been mentioned in the thread you linked. Either way, for my current situation, I decided to use OpenAI. It is true, the speed is noticeably slower, though I still had one question:
Does the following seem like an accurate ratio between input and output tokens?