v2.31.6 #354

Merged · 13 commits · May 21, 2024
2 changes: 2 additions & 0 deletions .gitignore
@@ -123,3 +123,5 @@ cypress/screenshots/
 .DS_Store

 .servers
+
+*.private.*
6 changes: 4 additions & 2 deletions README.md
@@ -68,12 +68,14 @@ console-search args (or env vars):

 You can use the `deploy_to_servers.sh` script to deploy to several machines. You need to create a `.servers` file with the following format:

-```txt
+```csv
 user@server1,,main_user_id
 user@server2,dev_user_id,
 ```

-This way you can choose which user to connect as on which server and which Noitool instance to connect to. Note the missing entry for the dev user id on server1 and the missing entry for the main user id on server2. That means that server 1 will connect to the main instance and server 2 will connect to the dev instance without both searchers competing for CPU time.
+This is a headerless csv file with the following columns: `ssh,main_user_id,dev_user_id`.
+
+Note the missing entry for the dev user id on server1 and the missing entry for the main user id on server2. That means that server 1 will connect to the main instance and server 2 will connect to the dev instance without both searchers competing for CPU time.

 Then run `./deploy_to_servers.sh` to deploy to all servers.
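The column layout described in the README can be read with a plain `IFS` loop. This is a hypothetical sketch of parsing the headerless csv, not the actual `deploy_to_servers.sh` logic; the variable names and the `servers.example` file are invented for illustration:

```shell
#!/bin/sh
# Hypothetical sketch: read a headerless .servers csv with the
# columns ssh,main_user_id,dev_user_id (not the real deploy script).
cat > servers.example <<'EOF'
user@server1,,main_user_id
user@server2,dev_user_id,
EOF

while IFS=, read -r ssh_target main_id dev_id; do
  [ -z "$ssh_target" ] && continue   # skip blank lines
  echo "host=$ssh_target main=$main_id dev=$dev_id"
done < servers.example
# → host=user@server1 main= dev=main_user_id
# → host=user@server2 main=dev_user_id dev=
```

Note that empty fields simply come through as empty variables, which is how the script can tell which instance a server should connect to.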

2 changes: 1 addition & 1 deletion dataScripts/full_parse.sh
@@ -21,9 +21,9 @@ echo "Done!"
 read -p "Do you want to copy the files to the data directory? (y/N) " -n 1 -r

 if [[ $REPLY =~ ^[Yy]$ ]]; then
-  mv ./src/out/locales ../public/
   rm -r ./src/out/gen # Not needed - need manual updates since these are templates
   mv -f ./src/out/spells.h ../src/services/SeedInfo/noita_random/src/spells.h
+  mv ./src/out/locales ../public/

   cp -rf ./src/out/* ../src/services/SeedInfo/data/
 fi
313 changes: 162 additions & 151 deletions package-lock.json

Large diffs are not rendered by default.

4 changes: 2 additions & 2 deletions package.json
100755 → 100644
@@ -1,6 +1,6 @@
 {
   "name": "noitool",
-  "version": "2.31.3",
+  "version": "2.31.6",
   "private": true,
   "type": "module",
   "homepage": "https://www.noitool.com/",
@@ -11,7 +11,7 @@
   "backblaze-b2": "^1.7.0",
   "body-parser": "^1.20.2",
   "cookie-parser": "^1.4.6",
-  "discord.js": "^14.14.1",
+  "discord.js": "^14.15.2",
   "dotenv-flow": "^4.1.0",
   "express": "^4.19.2",
   "express-rate-limit": "^7.2.0",
4 changes: 0 additions & 4 deletions server/io/compute.mjs
@@ -1,8 +1,4 @@
 import { schedule } from "node-cron";
-
-import B2 from "backblaze-b2";
-B2.prototype.uploadAny = await import("@gideo-llc/backblaze-b2-upload-any");
-
 import { User } from "../db.mjs";

 export const counts = {
49 changes: 29 additions & 20 deletions server/server.mjs
@@ -9,7 +9,6 @@ import RateLimit from "express-rate-limit";

 import B2 from "backblaze-b2";
 import { randomUUID } from "crypto";
-B2.prototype.uploadAny = await import("@gideo-llc/backblaze-b2-upload-any");

 import { createServer } from "http";

@@ -92,25 +91,36 @@ if (hasB2) {
   setInterval(authorize, 1000 * 60 * 60 * 23); // 23h
 }

+const uploadToB2 = async (data, bucketId, fileName) => {
+  if (!hasB2) {
+    return;
+  }
+  const uploadUrl = await b2.getUploadUrl({
+    bucketId,
+  });
+
+  const upload = await b2.uploadFile({
+    uploadUrl: uploadUrl.data.uploadUrl,
+    uploadAuthToken: uploadUrl.data.authorizationToken,
+    fileName,
+    data,
+  });
+
+  return upload;
+};
+
 const m = multer();
 app.post("/api/db_debug/", m.any(), async (req, res) => {
   const id = randomUUID();
   res.send({ id });
-  try {
-    await b2.uploadAny({
-      bucketId: "e3081aa3bc7d39b38a1d0615",
-      fileName: `${id}.db`,
-      partSize: r.data.recommendedPartSize,
-      data: req.files[0].buffer,
-    });
-  } catch (e) {
-    console.error(e);
-  }
-
+  const upload = await uploadToB2(req.files[0].buffer, "93c80a630c6d59a37add0615", `${id}.db`);
+  console.log(upload?.data);
 });
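The `/api/db_debug/` handler above replies with the generated id immediately and performs the B2 upload afterwards. A hypothetical way to exercise it from the command line (the multipart field name, the local file, and the port are all assumptions — `m.any()` accepts any field name, and the port depends on `PORT`):

```shell
# Hypothetical request against a locally running instance.
# "file" is an arbitrary multipart field name and 3000 an assumed port.
curl -s -F "file=@./debug.db" http://localhost:3000/api/db_debug/
# The response body is JSON of the form {"id":"<some-uuid>"}
```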

 app.get("/m/*", async (req, res) => {
   const m = req.params[0];
-  console.log(m);
+  console.log(JSON.stringify(m));
   res.append("Cache-Control", "immutable, max-age=360");
   res.send({});
 });
@@ -170,25 +180,24 @@ server.listen(PORT, () => {
   console.log(`Running at http://localhost:${PORT}`);
 });

-const upload = async () => {
+const uploadStats = async () => {
   if (!hasB2) {
     return;
   }
   try {
-    await b2.uploadAny({
-      bucketId: "93c80a630c6d59a37add0615",
-      fileName: `${new Date().toISOString()}.json`,
-      partSize: r.data.recommendedPartSize,
-      data: Buffer.from(JSON.stringify({ data, stats })),
-    });
+    await uploadToB2(
+      Buffer.from(JSON.stringify({ data, stats })),
+      "93c80a630c6d59a37add0615",
+      `${new Date().toISOString()}.json`,
+    );
     data = [];
     stats = [];
   } catch (e) {
     console.error(e);
   }
 };

-schedule("0 0 * * *", upload);
+schedule("0 0 * * *", uploadStats);

 const shutdown = signal => err => {
   if (err) console.error(err.stack || err);
2 changes: 1 addition & 1 deletion src/components/App.tsx
@@ -115,7 +115,7 @@ const Header = () => {
           bottom: "-0.25rem",
         }}
       >
-        Build April 5 2024
+        Build April 30 2024
       </code>
     )}
     {isDevBranch && <div />}
19 changes: 17 additions & 2 deletions src/services/SeedInfo/data/materials.json
@@ -1946,6 +1946,7 @@
     "color": "f7f3eeff",
     "name": "orb_powder",
     "hp": "1000",
+    "tags": ["NO_FUNGAL_SHIFT"],
     "burnable": "1",
     "density": "7",
     "ui_name": "$mat_orb_powder",
@@ -1978,6 +1979,13 @@
     "density": "6",
     "ui_name": "$mat_plastic_red"
   },
+  "plastic_grey": {
+    "wang_color": "ff707271",
+    "color": "707271ff",
+    "name": "plastic_grey",
+    "tags": ["corrodible", "meltable", "alchemy"],
+    "ui_name": "$mat_plastic_red"
+  },
   "plastic_red_molten": {
     "wang_color": "ff9000ab",
     "color": "9000abff",
@@ -1988,6 +1996,13 @@
     "density": "5.1",
     "ui_name": "$mat_plastic_red_molten"
   },
+  "plastic_grey_molten": {
+    "wang_color": "ff707273",
+    "color": "707273ff",
+    "name": "plastic_grey_molten",
+    "tags": ["liquid", "corrodible", "molten", "meltable_to_lava", "alchemy", "liquid_common"],
+    "ui_name": "$mat_plastic_red_molten"
+  },
   "grass": {
     "wang_color": "ff3abb30",
     "color": "3abb30ff",
@@ -2672,7 +2687,7 @@
     "color": "3b5d3bff",
     "name": "plastic",
     "hp": "4000",
-    "tags": ["box2d", "corrodible", "meltable", "alchemy", "solid", "earth"],
+    "tags": ["box2d", "corrodible", "meltable_plastic", "alchemy", "solid", "earth"],
     "burnable": "0",
     "density": "8",
     "ui_name": "$mat_plastic",
@@ -5777,7 +5792,7 @@
     "color": "3b6d4bff",
     "name": "plastic_prop",
     "hp": "4000",
-    "tags": ["box2d", "corrodible", "meltable", "alchemy", "solid", "earth"],
+    "tags": ["box2d", "corrodible", "meltable_plastic", "alchemy", "solid", "earth"],
     "burnable": "0",
     "density": "8",
     "ui_name": "$mat_plastic",
13,781 changes: 6,714 additions & 7,067 deletions src/services/SeedInfo/data/obj/biome_impl.json

Large diffs are not rendered by default.
