🐓 — manucorporat committed May 5, 2023 (0 parents, commit c21f60e)
Showing 598 changed files with 40,131 additions and 0 deletions.
3 changes: 3 additions & 0 deletions .gitignore
@@ -0,0 +1,3 @@
dist
server
node_modules
133 changes: 133 additions & 0 deletions README.md
@@ -0,0 +1,133 @@
# Qwik Docs Site ⚡️

## Development Builds

### Client only

During development, `index.html` is not the result of server-side rendering; instead, the Qwik app is built using client-side JavaScript only. This is ideal for development with Vite and its ability to reload modules quickly and on demand. However, this mode is only for development and does not showcase how Qwik works, since JavaScript is required to execute and Vite imports many development modules for the app to work.

```shell
pnpm dev
```

### Server-side Rendering (SSR) and Client

Serves a server-side rendered `index.html`, with client-side modules prefetched and loaded by the browser. This can be used to test server-side rendered content during development, but it will be slower than the client-only development build.

```shell
pnpm dev.ssr
```

## Production Builds

A production build should generate the client and server modules by running both client and server build commands.

```shell
pnpm build
```
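
The command above fans out to the client and server builds described below. As a rough sketch of how the three commands could relate (hypothetical wiring for illustration, not the repo's actual `package.json`):

```json
{
  "scripts": {
    "build": "pnpm build.client && pnpm build.ssr",
    "build.client": "vite build",
    "build.ssr": "vite build -c adapters/cloudflare-pages/vite.config.ts"
  }
}
```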

### Client Modules

Production build that creates only the client-side modules that are dynamically imported by the browser.

```shell
pnpm build.client
```

### Server Modules

Production build that creates the server-side rendering (SSR) module used by the server to render the HTML.

```shell
pnpm build.ssr
```

## Algolia search

Still a work in progress.

Resource: https://docsearch.algolia.com/

### Crawler

Set up the crawler at https://crawler.algolia.com/.

### Debug local site with crawler settings

To crawl the localhost site and test index settings for the content hierarchy, use this Docker command:

```shell
# create apiKey via https://www.algolia.com/account/api-keys
touch .env
# APPLICATION_ID=APPLICATION_ID
# API_KEY=API_KEY
docker run -it --rm --env-file=.env -e "CONFIG=$(cat ./packages/docs/algolia.json | jq -r tostring)" algolia/docsearch-scraper
```

See the [DocSearch-legacy Docker command](https://docsearch.algolia.com/docs/legacy/run-your-own#run-the-crawl-from-the-docker-image) guide.

> On macOS, the Docker container cannot reach the host's network directly; the workaround is to use `host.docker.internal` instead of `localhost`.

## Cloudflare Pages

Cloudflare's [wrangler](https://github.com/cloudflare/wrangler) CLI can be used to preview a production build locally. To start a local server, run:

```shell
pnpm serve
```

Then visit [http://localhost:8787/](http://localhost:8787/)

### Deployments

[Cloudflare Pages](https://pages.cloudflare.com/) are deployable through their [Git provider integrations](https://developers.cloudflare.com/pages/platform/git-integration/).

If you don't already have an account, then [create a Cloudflare account here](https://dash.cloudflare.com/sign-up/pages). Next go to your dashboard and follow the [Cloudflare Pages deployment guide](https://developers.cloudflare.com/pages/framework-guides/deploy-anything/).

Within the project's "Settings", under "Builds and deployments", the "Build command" should be `pnpm build`, and the "Build output directory" should be set to `dist`.

### Function Invocation Routes

Cloudflare Pages' [function-invocation-routes config](https://developers.cloudflare.com/pages/platform/functions/routing/#functions-invocation-routes) can be used to include or exclude certain paths from the worker functions. A `_routes.json` file gives developers more granular control over when the Function is invoked.
This is useful for determining whether a page response should be server-side rendered (SSR) or served from a static-site generated (SSG) `index.html` file.

By default, the Cloudflare Pages adapter _does not_ include a `public/_routes.json` config; instead, one is auto-generated from the build by the Cloudflare adapter. An example of an auto-generated `dist/_routes.json` would be:

```json
{
  "include": ["/*"],
  "exclude": [
    "/_headers",
    "/_redirects",
    "/build/*",
    "/favicon.ico",
    "/manifest.json",
    "/service-worker.js",
    "/about"
  ],
  "version": 1
}
```

In the above example, _all_ pages are SSR'd, while root static files such as `/favicon.ico` and any static assets in `/build/*` are excluded from the Functions and instead treated as static files.

In most cases the generated `dist/_routes.json` file is ideal. However, if you need more granular control over each path, you can instead provide your own `public/_routes.json` file. When the project provides its own `public/_routes.json` file, the Cloudflare adapter will not auto-generate the routes config and will instead use the one committed within the `public` directory.
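
A minimal hand-written `public/_routes.json` might look like the following (the `/blog/*` exclusion is a hypothetical example of a pre-rendered section, not a path from this repo):

```json
{
  "version": 1,
  "include": ["/*"],
  "exclude": ["/build/*", "/favicon.ico", "/blog/*"]
}
```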
26 changes: 26 additions & 0 deletions adapters/cloudflare-pages/vite.config.ts
@@ -0,0 +1,26 @@
import { cloudflarePagesAdapter } from '@builder.io/qwik-city/adapters/cloudflare-pages/vite';
import { extendConfig } from '@builder.io/qwik-city/vite';
import baseConfig from '../../vite.config';

export default extendConfig(baseConfig, () => {
  return {
    build: {
      ssr: true,
      rollupOptions: {
        input: ['src/entry.cloudflare-pages.tsx', '@qwik-city-plan'],
      },
      minify: false,
    },
    plugins: [
      cloudflarePagesAdapter({
        ssg: {
          include: ['/*'],
          exclude: ['/', '/demo/tasks/resource/'],
          origin:
            (process.env.CF_PAGES_BRANCH !== 'main' ? process.env.CF_PAGES_URL : null) ??
            'https://qwik.builder.io',
        },
      }),
    ],
  };
});
16 changes: 16 additions & 0 deletions algolia.json
@@ -0,0 +1,16 @@
{
  "index_name": "qwik",
  "sitemap_urls": ["http://qwik.builder.io/sitemap.xml"],
  "selectors": {
    "lvl0": {
      "selector": "header a.active",
      "global": true
    },
    "lvl1": "article h1",
    "lvl2": "article h2",
    "lvl3": "article h3",
    "lvl4": "article h4",
    "lvl5": "article h5",
    "text": "article p,article li,article td"
  }
}
104 changes: 104 additions & 0 deletions codesandbox.sync.ts
@@ -0,0 +1,104 @@
/* eslint-disable no-console */
import { readdirSync, readFileSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';

function scanFiles(
  path: string,
  filePredicate: (path: string) => boolean,
  fileCallback: (path: string) => void
) {
  readdirSync(path, { withFileTypes: true }).forEach((dirent) => {
    if (dirent.isDirectory()) {
      scanFiles(join(path, dirent.name), filePredicate, fileCallback);
    } else if (dirent.isFile()) {
      if (filePredicate(dirent.name)) {
        fileCallback(join(path, dirent.name));
      }
    }
  });
}

function mdxFiles(path: string) {
  return path.endsWith('.mdx');
}

function transformFile(
  path: string,
  lineTransformFn: (path: string, lines: string[]) => string[]
): void {
  const lines = readFile(path);
  const newLines = lineTransformFn(path, lines);
  writeFileSync(path, newLines.join('\n'));
}

function readFile(path: string) {
  try {
    const file = readFileSync(path, 'utf-8');
    return file.split('\n');
  } catch (e) {
    console.error('Error reading file: ' + path);
    throw e;
  }
}

function findCodeSandboxes(
  codeSandboxTransformFn: (mdxPath: string, srcPath: string, lines: string[]) => string[],
  mdxPath: string,
  lines: string[]
): string[] {
  const newLines = [];
  for (let lineNo = 0; lineNo < lines.length; lineNo++) {
    const line = lines[lineNo];
    newLines.push(line);
    const match = line.match(/(.*)<(CodeSandbox|CodeFile) src=["']([^"']*)["'][^/]*>$/);
    if (match) {
      const [, prefix, tag, srcPath] = match;
      const content: string[] = [];
      let contentEndLine: string = '';
      do {
        // Guard against a missing closing tag before reading past the end of
        // the file (the original `lineNo > lines.length` check fired too late,
        // after `lines[++lineNo]` had already read `undefined`).
        if (lineNo + 1 >= lines.length) {
          throw new Error(tag + ' not closed');
        }
        const contentLine = lines[++lineNo];
        if (contentLine.match(new RegExp('</' + tag + '>$'))) {
          contentEndLine = contentLine;
        } else if (contentLine.startsWith(prefix)) {
          content.push(contentLine.slice(prefix.length));
        } else {
          throw new Error(
            'Expecting content of `<' +
              tag +
              '>` to be indented with: ' +
              JSON.stringify(prefix) +
              ' Was: ' +
              JSON.stringify(contentLine) +
              ' in ' +
              mdxPath +
              ' at line ' +
              lineNo
          );
        }
      } while (!contentEndLine);
      const newContent = codeSandboxTransformFn(mdxPath, srcPath, content);
      newLines.push(...newContent.map((l) => prefix + l), contentEndLine);
    }
  }
  return newLines;
}

function syncCodeSandboxes(path: string) {
  transformFile(path, findCodeSandboxes.bind(null, syncCodeSandbox));
}

function syncCodeSandbox(mdxPath: string, srcPath: string, lines: string[]) {
  console.log('SYNCING', mdxPath, srcPath);
  const first = lines[0];
  const newContent = readFile(join('.', srcPath));
  // Trim trailing blank lines from the source file before embedding it.
  while (newContent.length && newContent[newContent.length - 1] == '') {
    newContent.pop();
  }
  const last = lines[lines.length - 1];
  return [first, ...newContent, last];
}

scanFiles('src/routes', mdxFiles, syncCodeSandboxes);
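
The sync script above keys everything off one regular expression that captures the indentation prefix, the tag name, and the `src` path. A self-contained sanity check of that pattern (the sample MDX line is invented for illustration):

```typescript
// The same pattern codesandbox.sync.ts uses to find embed openings in MDX.
const TAG_RE = /(.*)<(CodeSandbox|CodeFile) src=["']([^"']*)["'][^/]*>$/;

// Hypothetical MDX line: an embed opening indented by two spaces.
const line = '  <CodeSandbox src="/src/routes/demo/tasks/index.tsx">';
const match = line.match(TAG_RE);

if (match) {
  const [, prefix, tag, srcPath] = match;
  console.log(JSON.stringify(prefix)); // the indentation to re-apply to synced lines
  console.log(tag); // CodeSandbox
  console.log(srcPath); // /src/routes/demo/tasks/index.tsx
}
```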
94 changes: 94 additions & 0 deletions contributors.ts
@@ -0,0 +1,94 @@
/* eslint-disable no-console */
import { fetch } from 'undici';
import fs from 'node:fs';
import path from 'node:path';
import url from 'node:url';
import matter from 'gray-matter';

const rootDir = path.join(path.dirname(url.fileURLToPath(import.meta.url)), '..', '..');

async function updateContributors() {
  const routesDir = path.join(rootDir, 'packages', 'docs', 'src', 'routes');
  await updateDocsDir(routesDir);
}

async function updateDocsDir(dir: string) {
  const items = fs.readdirSync(dir);
  for (const itemName of items) {
    if (itemName === 'index.mdx') {
      await updateGithubCommits(path.join(dir, itemName));
    } else {
      const itemPath = path.join(dir, itemName);
      const itemStat = fs.statSync(itemPath);
      if (itemStat.isDirectory()) {
        await updateDocsDir(itemPath);
      }
    }
  }
}

async function updateGithubCommits(filePath: string) {
  console.log('update:', filePath);

  const gm = matter.read(filePath);

  const repoPath = path.relative(rootDir, filePath).replace(/\\/g, '/');
  const url = new URL(`https://api.github.com/repos/BuilderIO/qwik/commits`);
  url.searchParams.set('since', new Date('2022-01-01').toISOString());
  url.searchParams.set('path', repoPath);

  const response = await fetch(url.href);
  if (response.status !== 200) {
    console.log('error', response.status, response.statusText, await response.text());
    await new Promise((resolve) => setTimeout(resolve, 5000));
    return;
  }

  const commits: any = await response.json();
  if (!Array.isArray(commits)) {
    console.log('error', JSON.stringify(commits));
    await new Promise((resolve) => setTimeout(resolve, 5000));
    return;
  }

  const contributors: { author: string; count: number }[] = [];

  for (const commit of commits) {
    const author = commit?.author?.login;
    if (author) {
      const contributor = contributors.find((c) => c.author === author);
      if (contributor) {
        contributor.count++;
      } else {
        contributors.push({ author, count: 1 });
      }
    }
  }

  // Sort by commit count, most active contributors first.
  contributors.sort((a, b) => b.count - a.count);

  gm.data.contributors = gm.data.contributors || [];
  for (const contributor of contributors) {
    if (!gm.data.contributors.includes(contributor.author)) {
      gm.data.contributors.push(contributor.author);
    }
  }

  const md = matter.stringify(gm.content, gm.data);

  fs.writeFileSync(filePath, md);

  console.log(repoPath, contributors.length);

  // Throttle requests to stay under the GitHub API rate limit.
  await new Promise((resolve) => setTimeout(resolve, 1000));
}

updateContributors();