Refactor: migrate to batching toposort for execution of releases #46

Status: Draft. Wants to merge 2 commits into base: master.
26 changes: 2 additions & 24 deletions lib/createInlinePluginCreator.js
@@ -12,16 +12,14 @@ const { get } = require("lodash");
*
* @param {Package[]} packages The multi-semantic-release context.
* @param {MultiContext} multiContext The multi-semantic-release context.
* @param {Synchronizer} synchronizer Shared synchronization assets
* @param {Object} flags argv options
* @returns {Function} A function that creates an inline package.
*
* @internal
*/
function createInlinePluginCreator(packages, multiContext, synchronizer, flags) {
function createInlinePluginCreator(packages, multiContext, flags) {
// Vars.
const { cwd } = multiContext;
const { todo, waitFor, waitForAll, emit, getLucky } = synchronizer;

/**
* Update pkg deps.
@@ -92,10 +90,6 @@ function createInlinePluginCreator(packages, multiContext, synchronizer, flags)
Object.assign(pkg.loggerRef, context.logger);

pkg._ready = true;
emit(
"_readyForRelease",
todo().find((p) => !p._ready)
);

const res = await plugins.verifyConditions(context);

@@ -138,7 +132,6 @@ function createInlinePluginCreator(packages, multiContext, synchronizer, flags)

// Wait until all todo packages have been analyzed.
pkg._analyzed = true;
await waitForAll("_analyzed");

// Make sure type is "patch" if the package has any deps that have changed.
if (!pkg._nextType && hasChangedDeep(pkg._localDeps)) pkg._nextType = "patch";
@@ -180,13 +173,6 @@ function createInlinePluginCreator(packages, multiContext, synchronizer, flags)
// Set nextRelease for package.
pkg._nextRelease = context.nextRelease;

// Wait until all todo packages are ready to generate notes.
await waitForAll("_nextRelease", (p) => p._nextType);

// Wait until the current pkg is ready to generate notes
getLucky("_readyToGenerateNotes", pkg);
await waitFor("_readyToGenerateNotes", pkg);

// Update pkg deps.
updateManifestDeps(pkg, path);
pkg._depsUpdated = true;
@@ -215,20 +201,12 @@ function createInlinePluginCreator(packages, multiContext, synchronizer, flags)
debug("notes generated: %s", pkg.name);

// Return the notes.
return notes.join("\n\n");
return notes.join("\n").replace("\n\n", "\n").trim();
Author:

This may not be needed, but I noticed that our release notes contained a lot of unnecessary blank lines.
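A quick sketch of the joining behavior, with illustrative note strings (not from the PR). One caveat worth noting: `String.prototype.replace` with a string pattern only replaces the first occurrence, so a global regex would be needed to collapse every run of blank lines.

```javascript
// Illustrative note strings, not from the PR.
const notes = ["fix: a\n\n", "feat: b\n\n"];

// As written in the diff: .replace("\n\n", "\n") only touches the
// first occurrence of a double newline.
const joined = notes.join("\n").replace("\n\n", "\n").trim();
// → "fix: a\n\nfeat: b"

// A global regex collapses every run of blank lines.
const collapsed = notes.join("\n").replace(/\n{2,}/g, "\n").trim();
// → "fix: a\nfeat: b"
```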

};

const publish = async (pluginOptions, context) => {
pkg._prepared = true;

emit(
"_readyToGenerateNotes",
todo().find((p) => p._nextType && !p._prepared)
);

// Wait for all packages to be `prepare`d and tagged by `semantic-release`
await waitForAll("_prepared", (p) => p._nextType);

const res = await plugins.publish(context);

debug("published: %s", pkg.name);
89 changes: 0 additions & 89 deletions lib/getSynchronizer.js

This file was deleted.

2 changes: 1 addition & 1 deletion lib/hasChangedDeep.js
@@ -15,7 +15,7 @@ function hasChangedDeep(packages, ignore = []) {
// 1. Any local dep package itself has changed
if (p._nextType) return true;
// 2. Any local dep package has local deps that have changed.
else if (hasChangedDeep(p._localDeps, [...ignore, ...packages])) return true;
else if (p._localDeps && hasChangedDeep(p._localDeps, [...ignore, ...packages])) return true;
Author:

I originally came across this bug during a spike into why our API usage of multi-semantic-release didn't work. As far as I can tell, p._localDeps can be undefined, so this check prevents recursion that would otherwise throw.

Collaborator @antongolub, Nov 21, 2020:

Good catch. I've moved the localDeps resolver one level up: 48bcad7

pkg.localDeps = pkg.deps.map((d) => packages.find((p) => d === p.name)).filter(Boolean);
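The guarded recursion discussed in this thread can be sketched as a self-contained function (minimal fields assumed; the real function lives in lib/hasChangedDeep.js):

```javascript
// Minimal sketch: returns true if any package (or any of its local deps,
// recursively) has a pending release type. The `p._localDeps &&` guard
// prevents recursing into undefined.
function hasChangedDeep(packages, ignore = []) {
	return packages
		.filter((p) => !ignore.includes(p))
		.some((p) => {
			if (p._nextType) return true;
			return Boolean(p._localDeps && hasChangedDeep(p._localDeps, [...ignore, ...packages]));
		});
}

// A package with undefined _localDeps no longer crashes the recursion.
hasChangedDeep([{ name: "b" }]); // → false
hasChangedDeep([{ name: "a", _localDeps: [{ _nextType: "patch" }] }]); // → true
```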

// Nope.
else return false;
});
136 changes: 107 additions & 29 deletions lib/multiSemanticRelease.js
@@ -1,14 +1,15 @@
const { dirname } = require("path");
const semanticRelease = require("semantic-release");
const batchingToposort = require("batching-toposort");
const { check } = require("./blork");
const getLogger = require("./getLogger");
const getSynchronizer = require("./getSynchronizer");
const getConfig = require("./getConfig");
const getConfigSemantic = require("./getConfigSemantic");
const getManifest = require("./getManifest");
const cleanPath = require("./cleanPath");
const RescopedStream = require("./RescopedStream");
const createInlinePluginCreator = require("./createInlinePluginCreator");
const stream = require("stream");

/**
* The multirelease context.
@@ -67,33 +68,91 @@ async function multiSemanticRelease(
const multiContext = { options, cwd, env, stdout, stderr };

// Load packages from paths.
const packages = await Promise.all(paths.map((path) => getPackage(path, multiContext)));
packages.forEach((pkg) => logger.success(`Loaded package ${pkg.name}`));
logger.complete(`Queued ${packages.length} packages! Starting release...`);

// Shared signal bus.
const synchronizer = getSynchronizer(packages);
const { getLucky, waitFor } = synchronizer;

// Release all packages.
const createInlinePlugin = createInlinePluginCreator(packages, multiContext, synchronizer, flags);
await Promise.all(
packages.map(async (pkg) => {
// Avoid hypothetical concurrent initialization collisions / throttling issues.
// https://github.com/dhoulb/multi-semantic-release/issues/24
if (flags.sequentialInit) {
getLucky("_readyForRelease", pkg);
await waitFor("_readyForRelease", pkg);
const packages = await Promise.all(
paths.map((path) =>
getPackage(path, multiContext).then((pkg) => {
logger.success(`Loaded package ${pkg.name}`);
return pkg;
})
)
);

logger.complete(`Collected ${packages.length} packages! Starting release...`);

// --- Toposort
const packageNames = packages.map((pkg) => pkg.name);
const packageDag = {};

// create the dag roots for each package:
packages.forEach((pkg) => (packageDag[pkg.name] = []));
	// populate the roots with that package's dependencies on other packages:
packages.forEach((pkg) => {
pkg.deps.forEach((dep) => {
if (packageNames.includes(dep)) {
packageDag[dep].push(pkg.name);
}
});
});

// Prepare the batches for releasing:
const taskBatches = batchingToposort(packageDag);

logger.complete(`Calculated ${taskBatches.length} batches of releases for optimal releasing power!`);

const createInlinePlugin = createInlinePluginCreator(packages, multiContext, flags);

const releaseTrain = new Promise((resolve, reject) => {
Promise.resolve().then(async () => {
for (const batch of taskBatches) {
logger.log(
`Starting batch #${taskBatches.indexOf(batch) + 1} containing ${batch.length} package${
batch.length > 1 ? "s" : ""
}`
);

await Promise.all(
batch.map((packageName) => {
const pkg = packages.find((_pkg) => _pkg.name === packageName);

// We probably don't really need to create separated streams, but
// could be useful if we want to give a per-release log file or
// something similar:
pkg.stdout = stream.PassThrough();
pkg.stderr = stream.PassThrough();

pkg.stderr.pipe(process.stderr);
pkg.stdout.pipe(process.stdout);

// Start the release of the package: the idea of the catch/reject here
// is to abort the entire release process if we fail to release a
// package, but this may not actually work:
releasePackage(pkg, createInlinePlugin, multiContext).catch(reject);

return pkg.result;
})
);
}

return releasePackage(pkg, createInlinePlugin, multiContext);
resolve(packages);
});
});

return releaseTrain
.then(async (results) => {
return Promise.all(
results.map(async (pkg) => {
pkg.result = await pkg.result;
return pkg;
})
);
})
);
const released = packages.filter((pkg) => pkg.result).length;
.then((results) => {
const released = results.filter((pkg) => pkg.status === "success").length;

// Return packages list.
logger.complete(`Released ${released} of ${packages.length} packages, semantically!`);
return packages;
// Return packages list.
logger.complete(`Released ${released} of ${results.length} packages, semantically!`);
return results;
});
}

// Exports.
@@ -157,7 +216,7 @@ async function getPackage(path, { options: globalOptions, env, cwd, stdout, stde
async function releasePackage(pkg, createInlinePlugin, multiContext) {
// Vars.
const { options: pkgOptions, name, dir } = pkg;
const { env, stdout, stderr } = multiContext;
const { env } = multiContext;

// Make an 'inline plugin' for this package.
// The inline plugin is the only plugin we call semanticRelease() with.
@@ -180,10 +239,29 @@ async function releasePackage(pkg, createInlinePlugin, multiContext) {

// Call semanticRelease() on the directory and save result to pkg.
// Don't need to log out errors as semantic-release already does that.
pkg.result = await semanticRelease(options, {
pkg.result = semanticRelease(options, {
Author:

By not awaiting here, we allow parallel releases of packages within a given batch. This interleaves the log output for now, but it should make releases faster.
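The fire-then-collect pattern this comment describes can be sketched in isolation (names like fakeRelease and releaseBatch are illustrative, not from the PR): start every release in a batch without awaiting, then await the whole batch together.

```javascript
// Stand-in for semanticRelease(): resolves after a short delay.
const fakeRelease = (name) =>
	new Promise((resolve) => setTimeout(() => resolve(`${name}: released`), 10));

async function releaseBatch(batch) {
	const pending = batch.map((pkg) => {
		// Not awaited here: all releases in the batch run concurrently.
		pkg.result = fakeRelease(pkg.name);
		return pkg.result;
	});
	// Await the batch as a whole before moving to the next one.
	return Promise.all(pending);
}

releaseBatch([{ name: "lib-a" }, { name: "lib-b" }]).then((results) => {
	// results: ["lib-a: released", "lib-b: released"]
});
```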

cwd: dir,
env,
stdout: new RescopedStream(stdout, name),
stderr: new RescopedStream(stderr, name),
});
stdout: new RescopedStream(pkg.stdout, name),
stderr: new RescopedStream(pkg.stderr, name),
}).then(
(result) => {
console.log({
lastRelease: result.lastRelease,
nextRelease: result.nextRelease,
releases: result.releases,
});

pkg.isPending = false;
Author:

isPending was added while I was modifying the original synchronizer algorithm, so it can safely be ignored now. I was using it, rather than the presence or absence of the pkg.result property, to determine the todo() list.

pkg.status = "success";
return result;
},
(error) => {
pkg.isPending = false;
pkg.status = "failure";
throw error;
}
);

return pkg;
}
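To illustrate the batching step the PR introduces, here is a minimal batched topological sort in the spirit of batching-toposort (the helper name batchedToposort and the sample package names are assumptions for illustration; the PR itself uses the batching-toposort package). The DAG shape matches the diff: each key is a package name, and each value lists the packages that depend on it.

```javascript
// Minimal batched toposort (Kahn's algorithm by layers). Each batch
// contains packages with no dependencies on one another, so members
// of a batch can be released in parallel.
function batchedToposort(dag) {
	const indegree = {};
	for (const node of Object.keys(dag)) indegree[node] = indegree[node] || 0;
	for (const dependents of Object.values(dag))
		for (const d of dependents) indegree[d] = (indegree[d] || 0) + 1;

	const batches = [];
	let batch = Object.keys(indegree).filter((n) => indegree[n] === 0);
	while (batch.length) {
		batches.push(batch);
		const next = [];
		for (const node of batch)
			for (const dependent of dag[node])
				if (--indegree[dependent] === 0) next.push(dependent);
		batch = next;
	}
	return batches;
}

// Sample DAG: lib-a and lib-b depend on core; app depends on both.
const packageDag = {
	core: ["lib-a", "lib-b"],
	"lib-a": ["app"],
	"lib-b": ["app"],
	app: [],
};

const batches = batchedToposort(packageDag);
// → [["core"], ["lib-a", "lib-b"], ["app"]]
```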
1 change: 1 addition & 0 deletions package.json
@@ -47,6 +47,7 @@
},
"dependencies": {
"bash-glob": "^2.0.0",
"batching-toposort": "^1.2.0",
"blork": "^9.2.2",
"cosmiconfig": "^7.0.0",
"detect-indent": "^6.0.0",
5 changes: 5 additions & 0 deletions yarn.lock
@@ -1364,6 +1364,11 @@ bash-path@^1.0.1:
arr-union "^3.1.0"
is-windows "^1.0.1"

batching-toposort@^1.2.0:
version "1.2.0"
resolved "https://registry.yarnpkg.com/batching-toposort/-/batching-toposort-1.2.0.tgz#3a40e7c31c2e27b950a03ea630d0b8aace4c9821"
integrity sha512-HDf0OOv00dqYGm+M5tJ121RTzX0sK9fxzBMKXYsuQrY0pKSOJjc5qa0DUtzvCGkgIVf1YON2G1e/MHEdHXVaRQ==

bcrypt-pbkdf@^1.0.0:
version "1.0.2"
resolved "https://registry.yarnpkg.com/bcrypt-pbkdf/-/bcrypt-pbkdf-1.0.2.tgz#a4301d389b6a43f9b67ff3ca11a3f6637e360e9e"