chore: Add a unit testing framework (#49)
Add vitest as a unit testing framework.

Reviewed-on: #49
.gitea/workflows/test.yaml (new file, 19 lines)
@@ -0,0 +1,19 @@
name: Test

on:
  push:
  workflow_dispatch:

jobs:
  test:
    name: Run Tests
    runs-on: pi
    steps:
      - uses: actions/checkout@v6
      - uses: actions/setup-node@v4
        with:
          node-version: '22'
      - run: npm install
        shell: bash
      - run: npm test
        shell: bash
.gitignore (vendored, 1077 lines): diff suppressed because it is too large
.vscode/tasks.json (new file, vendored, 13 lines)
@@ -0,0 +1,13 @@
{
  "version": "2.0.0",
  "tasks": [
    {
      "type": "npm",
      "script": "test",
      "group": "test",
      "problemMatcher": [],
      "label": "npm: test",
      "detail": "jest"
    }
  ]
}
GEMINI.md (84 lines changed)
@@ -1,79 +1,33 @@
-# Gemini Code Assistant Guide: `screeps-deploy-action`
+# Gemini Actions

-This document provides a guide for Large Language Models (LLMs) and developers on understanding and interacting with the `screeps-deploy-action` project.
+This repository is maintained by Gemini.

-## Project Overview
+## Development Guidelines

-`screeps-deploy-action` is a GitHub Action designed to automate the deployment of JavaScript code to the online programming game Screeps. This project is aimed at supporting both GitHub and Gitea workflows, allowing developers to push their code from a Git repository directly to either the official `screeps.com` server or a private server. It utilizes **Gitea Workflows** (located in `.gitea/workflows`), which are largely compatible with GitHub Actions with minor syntax changes, for its continuous integration and deployment needs.
+* **Test-Driven Development (TDD):** Wherever possible, Test-Driven Development principles should be followed. Write tests before writing the code they are intended to validate.
+* **Pre-commit Hooks:** Ensure that `pre-commit` hooks are installed and active before making any commits. This can be done by running `pre-commit install` in your local repository.

-The action's core logic is in `index.js`. It uses the `screeps-api` library to communicate with the Screeps server. The action is configured via a workflow file (e.g., `.github/workflows/main.yml`) using inputs defined in `action.yaml`.
+## Repository Comparison

-### Key Files
+* On request, this repository should be compared against the rules and guidelines specified in the `README.md` of the reference repository: `https://git.horstenkamp.eu/Philipp/template-git`.

-- **`action.yaml`**: The manifest file for the GitHub Action. It defines the inputs, outputs, and execution environment for the action.
-- **`index.js`**: The main entry point for the action. It contains the core logic for reading files, connecting to the Screeps API, and uploading the code.
-- **`package.json`**: Defines the project's metadata and dependencies. The key dependency is `screeps-api`.
-- **`README.md`**: Provides user-facing documentation, including setup and usage examples.
+## Testing

-## Core Functionality
+This project uses [Vitest](https://vitest.dev/) for testing. The tests are located in the `__tests__` directory.

-The action performs the following steps:
+To run the tests locally, use the following command:

-1. **Reads Inputs**: It reads the configuration provided by the user in their workflow file. This includes server connection details, authentication credentials, and file paths.
-2. **Authentication**: It authenticates with the Screeps server using either a token or a username/password.
-3. **File Processing**:
-   * It reads all `.js` files from the repository matching the provided `pattern`.
-   * It can optionally perform placeholder replacements (e.g., `{{gitHash}}`, `{{deployTime}}`) in a specified file (`replace_file`) before deployment.
-4. **Code Deployment**: It uploads the processed files to the specified `branch` on the Screeps server.
-
-## Usage
-
-To use this action, a developer would create a `.yml` file in their `.github/workflows` directory.
-
-**Example Workflow:**
-
-```yaml
-name: Deploy to Screeps
-on:
-  push:
-    branches:
-      - main
-jobs:
-  deploy:
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v3
-      - name: Deploy to screeps.com
-        uses: ./
-        with:
-          token: ${{ secrets.SCREEPS_TOKEN }}
-          branch: 'default'
-          pattern: '*.js'
-```
+```bash
+npm test
+```

-### Configuration Inputs
+### Testing Pipeline

-The action is configured using the `with` key in the workflow step. The available inputs are defined in `action.yaml`:
+The tests are automatically run on every push and workflow dispatch using a Gitea workflow. The workflow is defined in `.gitea/workflows/test.yaml`. All testing for this repository is done via Gitea workflows, not GitHub workflows.

-- **`token`**: (Required) The authentication token for the Screeps API. It is recommended to store this as a secret.
-- **`protocol`**: The server protocol (`http` or `https`). Defaults to `https`.
-- **`hostname`**: The server hostname. Defaults to `screeps.com`.
-- **`port`**: The server port. Defaults to `443`.
-- **`path`**: The server path. Defaults to `/`.
-- **`username`**: The Screeps username (used if `token` is not provided).
-- **`password`**: The Screeps password (used if `token` is not provided).
-- **`branch`**: The in-game branch to deploy the code to. Defaults to `default`.
-- **`pattern`**: A glob pattern for the files to deploy. Defaults to `*.js`.
-- **`replace_file`**: Path to a file where placeholders like `{{gitHash}}` and `{{deployTime}}` should be replaced.
-- **`source_map_path`**: Path to a `main.js.map` file for Source Map support.
+The Gitea workflow does the following:

-## Modifying the Code
-
-When asked to modify the action's behavior, the primary file to edit will almost always be `index.js`.
-
-- For changes to the action's inputs or outputs, `action.yaml` must also be updated.
-- The core deployment logic is within the `postCode` function in `index.js`.
-- File reading is handled by `readFilesIntoDict`.
-- Placeholder replacement is handled by `readReplaceAndWriteFiles`.
-
-Before making changes, always review the existing code and the `screeps-api` documentation to understand how it interacts with the Screeps server. After making changes, ensure that any associated tests are updated or added.
+1. Checks out the code.
+2. Sets up Node.js.
+3. Installs the dependencies using `npm install`.
+4. Runs the tests using `npm test`.
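The placeholder replacement described in the removed GEMINI.md section boils down to chained global-regex substitutions. A minimal standalone sketch (hypothetical `substitute` helper; the real `replacePlaceholders` in `index.js` reads `GITHUB_SHA`/`GITHUB_REF` from the environment instead of taking them as arguments):

```javascript
// Hypothetical standalone version of the placeholder substitution;
// the /g flag ensures every occurrence of a placeholder is replaced.
function substitute(content, { gitHash, gitRef, hostname }) {
  const deployTime = new Date().toISOString();
  return content
    .replace(/{{gitHash}}/g, gitHash)
    .replace(/{{gitRef}}/g, gitRef)
    .replace(/{{deployTime}}/g, deployTime)
    .replace(/{{hostname}}/g, hostname);
}

console.log(substitute("{{gitHash}} -> {{hostname}}", {
  gitHash: "abc123",
  gitRef: "refs/heads/main",
  hostname: "screeps.com",
}));
// → abc123 -> screeps.com
```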
__tests__/index.test.js (new file, 199 lines)
@@ -0,0 +1,199 @@
const {
  validateAuthentication,
  replacePlaceholders,
  readReplaceAndWriteFiles,
  readFilesIntoDict,
} = require("../index");
const fs = require("fs");
const path = require("path");
const os = require("os");
const { glob } = require("glob");

describe("validateAuthentication", () => {
  it("should return null when only token is provided", () => {
    expect(validateAuthentication("token", null, null)).toBeNull();
  });

  it("should return an error message when token and username are provided", () => {
    expect(validateAuthentication("token", "user", null)).toBe(
      "Token is defined along with username and/or password.",
    );
  });

  it("should return an error message when token and password are provided", () => {
    expect(validateAuthentication("token", null, "pass")).toBe(
      "Token is defined along with username and/or password.",
    );
  });

  it("should return an error message when token, username, and password are provided", () => {
    expect(validateAuthentication("token", "user", "pass")).toBe(
      "Token is defined along with username and/or password.",
    );
  });

  it("should return an error message when no credentials are provided", () => {
    expect(validateAuthentication(null, null, null)).toBe(
      "Neither token nor password and username are defined.",
    );
  });

  it("should return an error message when only username is provided", () => {
    expect(validateAuthentication(null, "user", null)).toBe(
      "Username is defined but no password is provided.",
    );
  });

  it("should return an error message when only password is provided", () => {
    expect(validateAuthentication(null, null, "pass")).toBe(
      "Password is defined but no username is provided.",
    );
  });

  it("should return null when username and password are provided", () => {
    expect(validateAuthentication(null, "user", "pass")).toBeNull();
  });
});

describe("replacePlaceholders", () => {
  beforeEach(() => {
    process.env.GITHUB_SHA = "test-sha";
    process.env.GITHUB_REF = "test-ref";
  });

  it("should replace all placeholders", () => {
    const content =
      "hash: {{gitHash}}, ref: {{gitRef}}, time: {{deployTime}}, host: {{hostname}}";
    const replacedContent = replacePlaceholders(content, "test-host");
    expect(replacedContent).toMatch(/hash: test-sha/);
    expect(replacedContent).toMatch(/ref: test-ref/);
    expect(replacedContent).toMatch(/time: .*/);
    expect(replacedContent).toMatch(/host: test-host/);
  });
});

describe("readReplaceAndWriteFiles", () => {
  let tempDir;

  beforeEach(async () => {
    tempDir = await fs.promises.mkdtemp(
      path.join(os.tmpdir(), "replace-test-"),
    );
    process.env.GITHUB_SHA = "test-sha";
    process.env.GITHUB_REF = "test-ref";
  });

  afterEach(async () => {
    if (tempDir) {
      await fs.promises.rm(tempDir, { recursive: true, force: true });
    }
  });

  it("should find files and replace placeholders", async () => {
    const fileName = "test.js";
    const filePath = path.join(tempDir, fileName);
    const content = "hash: {{gitHash}}, ref: {{gitRef}}, host: {{hostname}}";
    await fs.promises.writeFile(filePath, content);

    const pattern = "*.js";
    // We pass tempDir as the prefix so glob searches inside it
    await readReplaceAndWriteFiles(pattern, tempDir, "test-host");

    const updatedContent = await fs.promises.readFile(filePath, "utf8");

    expect(updatedContent).toContain("hash: test-sha");
    expect(updatedContent).toContain("ref: test-ref");
    expect(updatedContent).toContain("host: test-host");
  });
});

describe("readFilesIntoDict", () => {
  let tempDir;

  beforeEach(async () => {
    tempDir = await fs.promises.mkdtemp(path.join(os.tmpdir(), "read-test-"));
    await fs.promises.mkdir(path.join(tempDir, "subdir"), { recursive: true });
  });

  afterEach(async () => {
    if (tempDir) {
      await fs.promises.rm(tempDir, { recursive: true, force: true });
    }
  });

  it("should read files into a dictionary with correct keys", async () => {
    const file1 = "file1.js";
    const content1 = "content1";
    await fs.promises.writeFile(path.join(tempDir, file1), content1);

    const file2 = "subdir/file2.js";
    const content2 = "content2";
    await fs.promises.writeFile(path.join(tempDir, file2), content2);

    const pattern = "**/*.js";
    const result = await readFilesIntoDict(pattern, tempDir);

    // readFilesIntoDict keys each entry with
    // path.basename(key, path.extname(key)), which drops both the
    // directory part and the extension: subdir/file2.js becomes "file2".
    expect(result["file1"]).toBe(content1);
    expect(result["file2"]).toBe(content2);
  });
});

describe("glob functionality", () => {
  let tempDir;

  beforeEach(async () => {
    tempDir = await fs.promises.mkdtemp(path.join(os.tmpdir(), "glob-test-"));
    await fs.promises.mkdir(path.join(tempDir, "lib"), { recursive: true });
    await fs.promises.mkdir(path.join(tempDir, "deep", "folder"), {
      recursive: true,
    });
    await fs.promises.writeFile(path.join(tempDir, "main.js"), "content");
    await fs.promises.writeFile(path.join(tempDir, "utils.js"), "content");
    await fs.promises.writeFile(
      path.join(tempDir, "lib", "helper.js"),
      "content",
    );
    await fs.promises.writeFile(
      path.join(tempDir, "lib", "data.json"),
      "content",
    );
    await fs.promises.writeFile(
      path.join(tempDir, "deep", "folder", "main.js"),
      "content",
    );
  });

  afterEach(async () => {
    if (tempDir) {
      await fs.promises.rm(tempDir, { recursive: true, force: true });
    }
  });

  it("should find all javascript files in the directory", async () => {
    // Ensure pattern uses forward slashes for glob
    const pattern = path.join(tempDir, "**", "*.js").split(path.sep).join("/");
    const files = await glob(pattern);

    // Normalize file paths to the system separator (backslashes on Windows)
    const normalizedFiles = files.map((f) => path.normalize(f));

    const expectedFiles = [
      path.join(tempDir, "deep", "folder", "main.js"),
      path.join(tempDir, "lib", "helper.js"),
      path.join(tempDir, "main.js"),
      path.join(tempDir, "utils.js"),
    ].sort();
    expect(normalizedFiles.sort()).toEqual(expectedFiles);
  });
});
dist/index.js (vendored, 718 lines changed)
@@ -1,6 +1,201 @@
/******/ (() => { // webpackBootstrap
/******/ var __webpack_modules__ = ({

/***/ 6136:
/***/ ((module, __unused_webpack_exports, __nccwpck_require__) => {

const { ScreepsAPI } = __nccwpck_require__(9546);
const core = __nccwpck_require__(7484);
const fs = __nccwpck_require__(9896);
const { glob } = __nccwpck_require__(1363);
const path = __nccwpck_require__(6928);

/**
 * Replaces specific placeholder strings within the provided content with corresponding dynamic values.
 *
 * This function specifically targets three placeholders:
 * - {{gitHash}} is replaced with the current Git commit hash, obtained from the GITHUB_SHA environment variable.
 * - {{gitRef}} is replaced with the Git reference (branch or tag) that triggered the workflow, obtained from the GITHUB_REF environment variable.
 * - {{deployTime}} is replaced with the current ISO timestamp.
 *
 * Note: This function is designed for use within a GitHub Actions workflow where GITHUB_SHA and GITHUB_REF environment variables are automatically set.
 *
 * @param {string} content - The string content in which placeholders are to be replaced.
 * @returns {string} The content with placeholders replaced by their respective dynamic values.
 */
function replacePlaceholders(content, hostname) {
  const deployTime = new Date().toISOString();
  return content
    .replace(/{{gitHash}}/g, process.env.GITHUB_SHA)
    .replace(/{{gitRef}}/g, process.env.GITHUB_REF)
    .replace(/{{deployTime}}/g, deployTime)
    .replace(/{{hostname}}/g, hostname);
}

/**
 * Reads all files matching a specified pattern, replaces certain placeholders in their content, and writes the updated content back to the files.
 *
 * This function searches for files in the filesystem using the provided glob pattern, optionally prefixed. It reads each file,
 * uses the `replacePlaceholders` function to replace specific placeholders in the file's content, and then writes the modified content
 * back to the original file. This is useful for dynamically updating file contents in a batch process, such as during a build or deployment.
 *
 * @param {string} pattern - The glob pattern used to find files. Example: '*.js' for all JavaScript files.
 * @param {string} [prefix] - An optional directory prefix to prepend to the glob pattern. This allows searching within a specific directory.
 * @returns {Promise<string[]>} A promise that resolves with an array of file paths that were processed, or rejects with an error if the process fails.
 */
async function readReplaceAndWriteFiles(pattern, prefix, hostname) {
  const globPattern = prefix ? path.join(prefix, pattern) : pattern;
  const files = await glob(globPattern);

  let processPromises = files.map((file) => {
    return fs.promises.readFile(file, "utf8").then((content) => {
      content = replacePlaceholders(content, hostname);
      return fs.promises.writeFile(file, content);
    });
  });

  await Promise.all(processPromises);
  return files;
}

/**
 * Reads files matching a glob pattern into a dictionary.
 * @param {string} pattern - Glob pattern to match files.
 * @param {string} prefix - Directory prefix for file paths.
 * @returns {Promise<Object>} - Promise resolving to a dictionary of file contents keyed by filenames.
 */
async function readFilesIntoDict(pattern, prefix) {
  // Prepend the prefix to the glob pattern
  const globPattern = prefix ? path.join(prefix, pattern) : pattern;
  const files = await glob(globPattern);

  let fileDict = {};
  let readPromises = files.map((file) => {
    return fs.promises.readFile(file, "utf8").then((content) => {
      // Remove the prefix from the filename and drop the file suffix
      let key = file;
      if (prefix && file.startsWith(prefix)) {
        key = key.slice(prefix.length);
      }
      key = path.basename(key, path.extname(key)); // Drop the file suffix

      fileDict[key] = content;
    });
  });

  await Promise.all(readPromises);
  return fileDict;
}

/**
 * Validates the provided authentication credentials.
 * @param {string} token - The authentication token.
 * @param {string} username - The username.
 * @param {string} password - The password.
 * @returns {string|null} - Returns an error message if validation fails, otherwise null.
 */
function validateAuthentication(token, username, password) {
  if (token) {
    if (username || password) {
      return "Token is defined along with username and/or password.";
    }
  } else {
    if (!username && !password) {
      return "Neither token nor password and username are defined.";
    }
    if (username && !password) {
      return "Username is defined but no password is provided.";
    }
    if (!username && password) {
      return "Password is defined but no username is provided.";
    }
  }
  return null; // No errors found
}

/**
 * Posts code to Screeps server.
 */
async function postCode() {
  const protocol = core.getInput("protocol") || "https";
  const hostname = core.getInput("hostname") || "screeps.com";
  const port = core.getInput("port") || "443";
  const path = core.getInput("path") || "/";

  const token = core.getInput("token") || undefined;
  const username = core.getInput("username") || undefined;
  const password = core.getInput("password") || undefined;
  const prefix = core.getInput("source-prefix");
  const pattern = core.getInput("pattern") || "*.js";
  const branch = core.getInput("branch") || "default";

  const gitReplace = core.getInput("git-replace") || null;

  if (gitReplace) {
    await readReplaceAndWriteFiles(gitReplace, prefix, hostname);
  }

  const files_to_push = await readFilesIntoDict(pattern, prefix);

  core.info(`Trying to upload the following files to ${branch}:`);
  Object.keys(files_to_push).forEach((key) => {
    core.info(`Key: ${key}`);
  });

  const login_arguments = {
    token: token,
    username: username,
    password: password,
    protocol: protocol,
    hostname: hostname,
    port: port,
    path: path,
  };

  core.info("login_arguments:");
  core.info(JSON.stringify(login_arguments, null, 2));

  const errorMessage = validateAuthentication(token, username, password);
  if (errorMessage) {
    core.error(errorMessage);
    return;
  }
  let api = new ScreepsAPI(login_arguments);
  if (token) {
    const response = await api.code.set(branch, files_to_push);
    core.info(JSON.stringify(response, null, 2));
    console.log(`Code set successfully to ${branch}`);
  } else {
    core.info(`Logging in as user ${username}`);
    await Promise.resolve()
      .then(() => api.auth(username, password, login_arguments))
      .then(() => {
        api.code.set(branch, files_to_push);
      })
      .then(() => {
        console.log(`Code set successfully to ${branch}`);
      })
      .catch((err) => {
        console.error("Error:", err);
      });
  }
}

if (require.main === require.cache[eval('__filename')]) {
  postCode();
}

module.exports = {
  validateAuthentication,
  replacePlaceholders,
  postCode,
  readReplaceAndWriteFiles,
  readFilesIntoDict,
};


/***/ }),

/***/ 4914:
/***/ (function(__unused_webpack_module, exports, __nccwpck_require__) {

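Since `validateAuthentication` in the bundled module above is a pure function, its contract is easy to exercise outside the action. A standalone copy, run directly:

```javascript
// Copied from the bundled action code above: returns an error string,
// or null when exactly one complete credential set is supplied.
function validateAuthentication(token, username, password) {
  if (token) {
    if (username || password) {
      return "Token is defined along with username and/or password.";
    }
  } else {
    if (!username && !password) {
      return "Neither token nor password and username are defined.";
    }
    if (username && !password) {
      return "Username is defined but no password is provided.";
    }
    if (!username && password) {
      return "Password is defined but no username is provided.";
    }
  }
  return null; // valid: token only, or username plus password
}

console.log(validateAuthentication("tok", null, null)); // → null
console.log(validateAuthentication(null, "user", null));
// → Username is defined but no password is provided.
```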
@@ -8123,7 +8318,7 @@ exports.colors = [6, 2, 3, 4, 5, 1];
 try {
   // Optional dependency (as in, doesn't need to be installed, NOT like optionalDependencies in package.json)
   // eslint-disable-next-line import/no-extraneous-dependencies
-  const supportsColor = __nccwpck_require__(75);
+  const supportsColor = __nccwpck_require__(1450);

   if (supportsColor && (supportsColor.stderr || supportsColor).level >= 2) {
     exports.colors = [

@@ -9873,7 +10068,7 @@ FormData.prototype.submit = function (params, cb) {
   request.removeListener('error', callback);
   request.removeListener('response', onResponse);

-  return cb.call(this, error, responce); // eslint-disable-line no-invalid-this
+  return cb.call(this, error, responce);
 };

 onResponse = callback.bind(this, null);

@@ -9897,7 +10092,7 @@ FormData.prototype._error = function (err) {
 FormData.prototype.toString = function () {
   return '[object FormData]';
 };
-setToStringTag(FormData, 'FormData');
+setToStringTag(FormData.prototype, 'FormData');

 // Public API
 module.exports = FormData;

@@ -10508,6 +10703,22 @@ if ($gOPD) {
 module.exports = $gOPD;


 /***/ }),

+/***/ 3813:
+/***/ ((module) => {
+
+"use strict";
+
+module.exports = (flag, argv = process.argv) => {
+  const prefix = flag.startsWith('-') ? '' : (flag.length === 1 ? '-' : '--');
+  const position = argv.indexOf(prefix + flag);
+  const terminatorPosition = argv.indexOf('--');
+  return position !== -1 && (terminatorPosition === -1 || position < terminatorPosition);
+};
+
+
+/***/ }),
+
 /***/ 3336:
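The `hasFlag` module added above (module `3813`) is small enough to read in isolation; the key detail is that a flag only counts if it appears before the `--` argument terminator. A standalone copy with an explicit `argv`:

```javascript
// Copy of the bundled hasFlag helper from module 3813 above: a flag is
// detected only when it occurs before the "--" terminator in argv.
const hasFlag = (flag, argv = process.argv) => {
  const prefix = flag.startsWith('-') ? '' : (flag.length === 1 ? '-' : '--');
  const position = argv.indexOf(prefix + flag);
  const terminatorPosition = argv.indexOf('--');
  return position !== -1 && (terminatorPosition === -1 || position < terminatorPosition);
};

console.log(hasFlag('color', ['node', 'app.js', '--color'])); // → true
console.log(hasFlag('color', ['node', 'app.js', '--', '--color'])); // → false
```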
@@ -12765,6 +12976,149 @@ class ScreepsAPI extends RawAPI {
 exports.ScreepsAPI = ScreepsAPI;


 /***/ }),

+/***/ 1450:
+/***/ ((module, __unused_webpack_exports, __nccwpck_require__) => {
+
+"use strict";
+
+const os = __nccwpck_require__(857);
+const tty = __nccwpck_require__(2018);
+const hasFlag = __nccwpck_require__(3813);
+
+const {env} = process;
+
+let forceColor;
+if (hasFlag('no-color') ||
+    hasFlag('no-colors') ||
+    hasFlag('color=false') ||
+    hasFlag('color=never')) {
+  forceColor = 0;
+} else if (hasFlag('color') ||
+    hasFlag('colors') ||
+    hasFlag('color=true') ||
+    hasFlag('color=always')) {
+  forceColor = 1;
+}
+
+if ('FORCE_COLOR' in env) {
+  if (env.FORCE_COLOR === 'true') {
+    forceColor = 1;
+  } else if (env.FORCE_COLOR === 'false') {
+    forceColor = 0;
+  } else {
+    forceColor = env.FORCE_COLOR.length === 0 ? 1 : Math.min(parseInt(env.FORCE_COLOR, 10), 3);
+  }
+}
+
+function translateLevel(level) {
+  if (level === 0) {
+    return false;
+  }
+
+  return {
+    level,
+    hasBasic: true,
+    has256: level >= 2,
+    has16m: level >= 3
+  };
+}
+
+function supportsColor(haveStream, streamIsTTY) {
+  if (forceColor === 0) {
+    return 0;
+  }
+
+  if (hasFlag('color=16m') ||
+      hasFlag('color=full') ||
+      hasFlag('color=truecolor')) {
+    return 3;
+  }
+
+  if (hasFlag('color=256')) {
+    return 2;
+  }
+
+  if (haveStream && !streamIsTTY && forceColor === undefined) {
+    return 0;
+  }
+
+  const min = forceColor || 0;
+
+  if (env.TERM === 'dumb') {
+    return min;
+  }
+
+  if (process.platform === 'win32') {
+    // Windows 10 build 10586 is the first Windows release that supports 256 colors.
+    // Windows 10 build 14931 is the first release that supports 16m/TrueColor.
+    const osRelease = os.release().split('.');
+    if (
+      Number(osRelease[0]) >= 10 &&
+      Number(osRelease[2]) >= 10586
+    ) {
+      return Number(osRelease[2]) >= 14931 ? 3 : 2;
+    }
+
+    return 1;
+  }
+
+  if ('CI' in env) {
+    if (['TRAVIS', 'CIRCLECI', 'APPVEYOR', 'GITLAB_CI', 'GITHUB_ACTIONS', 'BUILDKITE'].some(sign => sign in env) || env.CI_NAME === 'codeship') {
+      return 1;
+    }
+
+    return min;
+  }
+
+  if ('TEAMCITY_VERSION' in env) {
+    return /^(9\.(0*[1-9]\d*)\.|\d{2,}\.)/.test(env.TEAMCITY_VERSION) ? 1 : 0;
+  }
+
+  if (env.COLORTERM === 'truecolor') {
+    return 3;
+  }
+
+  if ('TERM_PROGRAM' in env) {
+    const version = parseInt((env.TERM_PROGRAM_VERSION || '').split('.')[0], 10);
+
+    switch (env.TERM_PROGRAM) {
+      case 'iTerm.app':
+        return version >= 3 ? 3 : 2;
+      case 'Apple_Terminal':
+        return 2;
+      // No default
+    }
+  }
+
+  if (/-256(color)?$/i.test(env.TERM)) {
+    return 2;
+  }
+
+  if (/^screen|^xterm|^vt100|^vt220|^rxvt|color|ansi|cygwin|linux/i.test(env.TERM)) {
+    return 1;
+  }
+
+  if ('COLORTERM' in env) {
+    return 1;
+  }
+
+  return min;
+}
+
+function getSupportLevel(stream) {
+  const level = supportsColor(stream, stream && stream.isTTY);
+  return translateLevel(level);
+}
+
+module.exports = {
+  supportsColor: getSupportLevel,
+  stdout: translateLevel(supportsColor(true, tty.isatty(1))),
+  stderr: translateLevel(supportsColor(true, tty.isatty(2)))
+};
+
+
+/***/ }),
+
 /***/ 770:
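The `supports-color` module bundled above maps a numeric level to a capability object via `translateLevel`. A standalone copy showing the mapping:

```javascript
// Copy of translateLevel from the bundled supports-color module above:
// level 0 means no color; levels 2 and 3 unlock 256-color and truecolor.
function translateLevel(level) {
  if (level === 0) {
    return false;
  }
  return {
    level,
    hasBasic: true,
    has256: level >= 2,
    has16m: level >= 3
  };
}

console.log(translateLevel(0)); // → false
console.log(translateLevel(2)); // → { level: 2, hasBasic: true, has256: true, has16m: false }
```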
@@ -41339,14 +41693,6 @@ if (typeof window === "undefined" || window === null) {
 module.exports = Yaml;


-/***/ }),
-
-/***/ 75:
-/***/ ((module) => {
-
-module.exports = eval("require")("supports-color");
-
-
 /***/ }),

 /***/ 2613:
@@ -44982,18 +45328,20 @@ exports.GlobStream = GlobStream;
|
||||
*/
|
||||
Object.defineProperty(exports, "__esModule", ({ value: true }));
|
||||
exports.LRUCache = void 0;
|
||||
const perf = typeof performance === 'object' &&
|
||||
const defaultPerf = (typeof performance === 'object' &&
|
||||
performance &&
|
||||
typeof performance.now === 'function'
|
||||
? performance
|
||||
typeof performance.now === 'function') ?
|
||||
performance
|
||||
: Date;
|
||||
const warned = new Set();
|
||||
/* c8 ignore start */
|
||||
const PROCESS = (typeof process === 'object' && !!process ? process : {});
|
||||
const PROCESS = (typeof process === 'object' && !!process ?
|
||||
process
|
||||
: {});
|
||||
/* c8 ignore start */
|
||||
const emitWarning = (msg, type, code, fn) => {
|
||||
typeof PROCESS.emitWarning === 'function'
|
||||
? PROCESS.emitWarning(msg, type, code, fn)
|
||||
typeof PROCESS.emitWarning === 'function' ?
|
||||
PROCESS.emitWarning(msg, type, code, fn)
|
||||
: console.error(`[${code}] ${type}: ${msg}`);
|
||||
};
|
||||
let AC = globalThis.AbortController;
|
||||
@@ -45057,16 +45405,11 @@ const isPosInt = (n) => n && n === Math.floor(n) && n > 0 && isFinite(n);
|
||||
// zeroes at init time is brutal when you get that big.
|
||||
// But why not be complete?
|
||||
// Maybe in the future, these limits will have expanded.
|
||||
const getUintArray = (max) => !isPosInt(max)
|
||||
? null
|
||||
: max <= Math.pow(2, 8)
|
||||
? Uint8Array
|
||||
: max <= Math.pow(2, 16)
|
||||
? Uint16Array
|
||||
: max <= Math.pow(2, 32)
|
||||
? Uint32Array
|
||||
: max <= Number.MAX_SAFE_INTEGER
|
||||
? ZeroArray
|
||||
const getUintArray = (max) => !isPosInt(max) ? null
|
||||
: max <= Math.pow(2, 8) ? Uint8Array
|
||||
: max <= Math.pow(2, 16) ? Uint16Array
|
||||
: max <= Math.pow(2, 32) ? Uint32Array
|
||||
: max <= Number.MAX_SAFE_INTEGER ? ZeroArray
|
||||
: null;
|
||||
/* c8 ignore stop */
|
||||
class ZeroArray extends Array {
|
||||
@@ -45129,6 +45472,13 @@ class LRUCache {
|
||||
#disposeAfter;
|
||||
#fetchMethod;
|
||||
#memoMethod;
|
||||
#perf;
|
||||
/**
|
||||
* {@link LRUCache.OptionsBase.perf}
|
||||
*/
|
||||
get perf() {
|
||||
return this.#perf;
|
||||
}
|
||||
/**
|
||||
* {@link LRUCache.OptionsBase.ttl}
|
||||
*/
|
||||
@@ -45204,6 +45554,7 @@ class LRUCache {
#sizes;
#starts;
#ttls;
#autopurgeTimers;
#hasDispose;
#hasFetchMethod;
#hasDisposeAfter;
@@ -45222,6 +45573,7 @@ class LRUCache {
// properties
starts: c.#starts,
ttls: c.#ttls,
autopurgeTimers: c.#autopurgeTimers,
sizes: c.#sizes,
keyMap: c.#keyMap,
keyList: c.#keyList,
@@ -45297,7 +45649,13 @@ class LRUCache {
return this.#disposeAfter;
}
constructor(options) {
const { max = 0, ttl, ttlResolution = 1, ttlAutopurge, updateAgeOnGet, updateAgeOnHas, allowStale, dispose, onInsert, disposeAfter, noDisposeOnSet, noUpdateTTL, maxSize = 0, maxEntrySize = 0, sizeCalculation, fetchMethod, memoMethod, noDeleteOnFetchRejection, noDeleteOnStaleGet, allowStaleOnFetchRejection, allowStaleOnFetchAbort, ignoreFetchAbort, } = options;
const { max = 0, ttl, ttlResolution = 1, ttlAutopurge, updateAgeOnGet, updateAgeOnHas, allowStale, dispose, onInsert, disposeAfter, noDisposeOnSet, noUpdateTTL, maxSize = 0, maxEntrySize = 0, sizeCalculation, fetchMethod, memoMethod, noDeleteOnFetchRejection, noDeleteOnStaleGet, allowStaleOnFetchRejection, allowStaleOnFetchAbort, ignoreFetchAbort, perf, } = options;
if (perf !== undefined) {
if (typeof perf?.now !== 'function') {
throw new TypeError('perf option must have a now() method if specified');
}
}
this.#perf = perf ?? defaultPerf;
if (max !== 0 && !isPosInt(max)) {
throw new TypeError('max option must be a nonnegative integer');
}
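The constructor change above threads an injectable `perf` clock through the vendored lru-cache so that tests can control time instead of depending on the real `performance.now()`. The pattern in miniature, as a standalone sketch (the `TtlEntry` and `FakeClock` names are illustrative, not part of lru-cache):

```javascript
// Minimal sketch of an injectable clock, mirroring the `perf` option above.
const defaultPerf =
  typeof performance === "object" && typeof performance.now === "function"
    ? performance
    : Date; // Date.now() as a coarse fallback clock

class TtlEntry {
  constructor(ttl, { perf = defaultPerf } = {}) {
    // Same validation shape as the lru-cache constructor above.
    if (typeof perf.now !== "function") {
      throw new TypeError("perf option must have a now() method if specified");
    }
    this.perf = perf;
    this.ttl = ttl;
    this.start = perf.now();
  }
  isStale() {
    return this.perf.now() - this.start > this.ttl;
  }
}

// In tests, a fake clock makes expiry deterministic:
class FakeClock {
  constructor() { this.t = 0; }
  now() { return this.t; }
  advance(ms) { this.t += ms; }
}

const clock = new FakeClock();
const entry = new TtlEntry(100, { perf: clock });
clock.advance(150); // now past the 100ms TTL, so entry.isStale() is true
```

This is why the vendored bundle replaces the module-level `perf.now()` calls with `this.#perf.now()` throughout: every time read goes through the instance's clock.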
@@ -45317,13 +45675,11 @@ class LRUCache {
throw new TypeError('sizeCalculation set to non-function');
}
}
if (memoMethod !== undefined &&
typeof memoMethod !== 'function') {
if (memoMethod !== undefined && typeof memoMethod !== 'function') {
throw new TypeError('memoMethod must be a function if defined');
}
this.#memoMethod = memoMethod;
if (fetchMethod !== undefined &&
typeof fetchMethod !== 'function') {
if (fetchMethod !== undefined && typeof fetchMethod !== 'function') {
throw new TypeError('fetchMethod must be a function if specified');
}
this.#fetchMethod = fetchMethod;
@@ -45378,9 +45734,7 @@ class LRUCache {
this.updateAgeOnGet = !!updateAgeOnGet;
this.updateAgeOnHas = !!updateAgeOnHas;
this.ttlResolution =
isPosInt(ttlResolution) || ttlResolution === 0
? ttlResolution
: 1;
isPosInt(ttlResolution) || ttlResolution === 0 ? ttlResolution : 1;
this.ttlAutopurge = !!ttlAutopurge;
this.ttl = ttl || 0;
if (this.ttl) {
@@ -45415,10 +45769,21 @@ class LRUCache {
const starts = new ZeroArray(this.#max);
this.#ttls = ttls;
this.#starts = starts;
this.#setItemTTL = (index, ttl, start = perf.now()) => {
const purgeTimers = this.ttlAutopurge ?
new Array(this.#max)
: undefined;
this.#autopurgeTimers = purgeTimers;
this.#setItemTTL = (index, ttl, start = this.#perf.now()) => {
starts[index] = ttl !== 0 ? start : 0;
ttls[index] = ttl;
if (ttl !== 0 && this.ttlAutopurge) {
// clear out the purge timer if we're setting TTL to 0, and
// previously had a ttl purge timer running, so it doesn't
// fire unnecessarily.
if (purgeTimers?.[index]) {
clearTimeout(purgeTimers[index]);
purgeTimers[index] = undefined;
}
if (ttl !== 0 && purgeTimers) {
const t = setTimeout(() => {
if (this.#isStale(index)) {
this.#delete(this.#keyList[index], 'expire');
@@ -45430,10 +45795,11 @@ class LRUCache {
t.unref();
}
/* c8 ignore stop */
purgeTimers[index] = t;
}
};
this.#updateItemAge = index => {
starts[index] = ttls[index] !== 0 ? perf.now() : 0;
starts[index] = ttls[index] !== 0 ? this.#perf.now() : 0;
};
this.#statusTTL = (status, index) => {
if (ttls[index]) {
@@ -45453,7 +45819,7 @@ class LRUCache {
// that costly call repeatedly.
let cachedNow = 0;
const getNow = () => {
const n = perf.now();
const n = this.#perf.now();
if (this.ttlResolution > 0) {
cachedNow = n;
const t = setTimeout(() => (cachedNow = 0), this.ttlResolution);
@@ -45621,8 +45987,7 @@ class LRUCache {
*keys() {
for (const i of this.#indexes()) {
const k = this.#keyList[i];
if (k !== undefined &&
!this.#isBackgroundFetch(this.#valList[i])) {
if (k !== undefined && !this.#isBackgroundFetch(this.#valList[i])) {
yield k;
}
}
@@ -45636,8 +46001,7 @@ class LRUCache {
*rkeys() {
for (const i of this.#rindexes()) {
const k = this.#keyList[i];
if (k !== undefined &&
!this.#isBackgroundFetch(this.#valList[i])) {
if (k !== undefined && !this.#isBackgroundFetch(this.#valList[i])) {
yield k;
}
}
@@ -45649,8 +46013,7 @@ class LRUCache {
*values() {
for (const i of this.#indexes()) {
const v = this.#valList[i];
if (v !== undefined &&
!this.#isBackgroundFetch(this.#valList[i])) {
if (v !== undefined && !this.#isBackgroundFetch(this.#valList[i])) {
yield this.#valList[i];
}
}
@@ -45664,8 +46027,7 @@ class LRUCache {
*rvalues() {
for (const i of this.#rindexes()) {
const v = this.#valList[i];
if (v !== undefined &&
!this.#isBackgroundFetch(this.#valList[i])) {
if (v !== undefined && !this.#isBackgroundFetch(this.#valList[i])) {
yield this.#valList[i];
}
}
@@ -45690,9 +46052,7 @@ class LRUCache {
find(fn, getOptions = {}) {
for (const i of this.#indexes()) {
const v = this.#valList[i];
const value = this.#isBackgroundFetch(v)
? v.__staleWhileFetching
: v;
const value = this.#isBackgroundFetch(v) ? v.__staleWhileFetching : v;
if (value === undefined)
continue;
if (fn(value, this.#keyList[i], this)) {
@@ -45714,9 +46074,7 @@ class LRUCache {
forEach(fn, thisp = this) {
for (const i of this.#indexes()) {
const v = this.#valList[i];
const value = this.#isBackgroundFetch(v)
? v.__staleWhileFetching
: v;
const value = this.#isBackgroundFetch(v) ? v.__staleWhileFetching : v;
if (value === undefined)
continue;
fn.call(thisp, value, this.#keyList[i], this);
@@ -45729,9 +46087,7 @@ class LRUCache {
rforEach(fn, thisp = this) {
for (const i of this.#rindexes()) {
const v = this.#valList[i];
const value = this.#isBackgroundFetch(v)
? v.__staleWhileFetching
: v;
const value = this.#isBackgroundFetch(v) ? v.__staleWhileFetching : v;
if (value === undefined)
continue;
fn.call(thisp, value, this.#keyList[i], this);
@@ -45768,17 +46124,18 @@ class LRUCache {
if (i === undefined)
return undefined;
const v = this.#valList[i];
const value = this.#isBackgroundFetch(v)
? v.__staleWhileFetching
: v;
/* c8 ignore start - this isn't tested for the info function,
* but it's the same logic as found in other places. */
const value = this.#isBackgroundFetch(v) ? v.__staleWhileFetching : v;
if (value === undefined)
return undefined;
/* c8 ignore end */
const entry = { value };
if (this.#ttls && this.#starts) {
const ttl = this.#ttls[i];
const start = this.#starts[i];
if (ttl && start) {
const remain = ttl - (perf.now() - start);
const remain = ttl - (this.#perf.now() - start);
entry.ttl = remain;
entry.start = Date.now();
}
@@ -45806,9 +46163,7 @@ class LRUCache {
for (const i of this.#indexes({ allowStale: true })) {
const key = this.#keyList[i];
const v = this.#valList[i];
const value = this.#isBackgroundFetch(v)
? v.__staleWhileFetching
: v;
const value = this.#isBackgroundFetch(v) ? v.__staleWhileFetching : v;
if (value === undefined || key === undefined)
continue;
const entry = { value };
@@ -45816,7 +46171,7 @@ class LRUCache {
entry.ttl = this.#ttls[i];
// always dump the start relative to a portable timestamp
// it's ok for this to be a bit slow, it's a rare operation.
const age = perf.now() - this.#starts[i];
const age = this.#perf.now() - this.#starts[i];
entry.start = Math.floor(Date.now() - age);
}
if (this.#sizes) {
@@ -45846,7 +46201,7 @@ class LRUCache {
//
// it's ok for this to be a bit slow, it's a rare operation.
const age = Date.now() - entry.start;
entry.start = perf.now() - age;
entry.start = this.#perf.now() - age;
}
this.set(key, entry.value, entry);
}
@@ -45903,12 +46258,9 @@ class LRUCache {
let index = this.#size === 0 ? undefined : this.#keyMap.get(k);
if (index === undefined) {
// addition
index = (this.#size === 0
? this.#tail
: this.#free.length !== 0
? this.#free.pop()
: this.#size === this.#max
? this.#evict(false)
index = (this.#size === 0 ? this.#tail
: this.#free.length !== 0 ? this.#free.pop()
: this.#size === this.#max ? this.#evict(false)
: this.#size);
this.#keyList[index] = k;
this.#valList[index] = v;
@@ -45955,8 +46307,8 @@ class LRUCache {
this.#valList[index] = v;
if (status) {
status.set = 'replace';
const oldValue = oldVal && this.#isBackgroundFetch(oldVal)
? oldVal.__staleWhileFetching
const oldValue = oldVal && this.#isBackgroundFetch(oldVal) ?
oldVal.__staleWhileFetching
: oldVal;
if (oldValue !== undefined)
status.oldValue = oldValue;
@@ -46033,6 +46385,10 @@ class LRUCache {
}
}
this.#removeItemSize(head);
if (this.#autopurgeTimers?.[head]) {
clearTimeout(this.#autopurgeTimers[head]);
this.#autopurgeTimers[head] = undefined;
}
// if we aren't about to use the index, then null these out
if (free) {
this.#keyList[head] = undefined;
@@ -46105,8 +46461,7 @@ class LRUCache {
peek(k, peekOptions = {}) {
const { allowStale = this.allowStale } = peekOptions;
const index = this.#keyMap.get(k);
if (index === undefined ||
(!allowStale && this.#isStale(index))) {
if (index === undefined || (!allowStale && this.#isStale(index))) {
return;
}
const v = this.#valList[index];
@@ -46148,9 +46503,13 @@ class LRUCache {
}
// either we didn't abort, and are still here, or we did, and ignored
const bf = p;
if (this.#valList[index] === p) {
// if nothing else has been written there but we're set to update the
// cache and ignore the abort, or if it's still pending on this specific
// background request, then write it to the cache.
const vl = this.#valList[index];
if (vl === p || (ignoreAbort && updateCache && vl === undefined)) {
if (v === undefined) {
if (bf.__staleWhileFetching) {
if (bf.__staleWhileFetching !== undefined) {
this.#valList[index] = bf.__staleWhileFetching;
}
else {
@@ -46212,8 +46571,7 @@ class LRUCache {
// defer check until we are actually aborting,
// so fetchMethod can override.
ac.signal.addEventListener('abort', () => {
if (!options.ignoreFetchAbort ||
options.allowStaleOnFetchAbort) {
if (!options.ignoreFetchAbort || options.allowStaleOnFetchAbort) {
res(undefined);
// when it eventually resolves, update the cache.
if (options.allowStaleOnFetchAbort) {
@@ -46445,6 +46803,10 @@ class LRUCache {
if (this.#size !== 0) {
const index = this.#keyMap.get(k);
if (index !== undefined) {
if (this.#autopurgeTimers?.[index]) {
clearTimeout(this.#autopurgeTimers?.[index]);
this.#autopurgeTimers[index] = undefined;
}
deleted = true;
if (this.#size === 1) {
this.#clear(reason);
@@ -46520,6 +46882,11 @@ class LRUCache {
if (this.#ttls && this.#starts) {
this.#ttls.fill(0);
this.#starts.fill(0);
for (const t of this.#autopurgeTimers ?? []) {
if (t !== undefined)
clearTimeout(t);
}
this.#autopurgeTimers?.fill(undefined);
}
if (this.#sizes) {
this.#sizes.fill(0);
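The autopurge hunks above keep at most one timer per cache slot, clear it on overwrite, eviction, delete, and clear, and `unref()` it so a pending purge never holds the Node process open. The same bookkeeping in isolation, as a standalone sketch (the `scheduleExpiry` helper is illustrative, not lru-cache's API):

```javascript
// Standalone sketch of the per-slot autopurge pattern added above.
const timers = new Array(4); // one slot per cache index, like #autopurgeTimers

function scheduleExpiry(index, ttl, onExpire) {
  // Clear any previous timer for this slot so a stale purge can't fire.
  if (timers[index]) {
    clearTimeout(timers[index]);
    timers[index] = undefined;
  }
  if (ttl !== 0) {
    const t = setTimeout(() => {
      timers[index] = undefined;
      onExpire(index);
    }, ttl);
    // Don't let a pending purge keep the event loop alive.
    if (typeof t.unref === "function") t.unref();
    timers[index] = t;
  }
}

scheduleExpiry(0, 10, (i) => console.log(`slot ${i} expired`));
scheduleExpiry(0, 0, () => {}); // setting TTL to 0 cancels the pending timer
```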
@@ -49579,7 +49946,7 @@ const entToType = (s) => s.isFile() ? IFREG
: s.isFIFO() ? IFIFO
: UNKNOWN;
// normalize unicode path names
const normalizeCache = new Map();
const normalizeCache = new lru_cache_1.LRUCache({ max: 2 ** 12 });
const normalize = (s) => {
const c = normalizeCache.get(s);
if (c)
@@ -49588,7 +49955,7 @@ const normalize = (s) => {
normalizeCache.set(s, n);
return n;
};
const normalizeNocaseCache = new Map();
const normalizeNocaseCache = new lru_cache_1.LRUCache({ max: 2 ** 12 });
const normalizeNocase = (s) => {
const c = normalizeNocaseCache.get(s);
if (c)
@@ -49779,6 +50146,7 @@ class PathBase {
get parentPath() {
return (this.parent || this).fullpath();
}
/* c8 ignore start */
/**
* Deprecated alias for Dirent['parentPath'] Somewhat counterintuitively,
* this property refers to the *parent* path, not the path object itself.
@@ -49788,6 +50156,7 @@ class PathBase {
get path() {
return this.parentPath;
}
/* c8 ignore stop */
/**
* Do not create new Path objects directly. They should always be accessed
* via the PathScurry class or other methods on the Path class.
@@ -51543,185 +51912,12 @@ module.exports = /*#__PURE__*/JSON.parse('{"application/1d-interleaved-parityfec
/******/ if (typeof __nccwpck_require__ !== 'undefined') __nccwpck_require__.ab = __dirname + "/";
/******/
/************************************************************************/
var __webpack_exports__ = {};
const { ScreepsAPI } = __nccwpck_require__(9546);
const core = __nccwpck_require__(7484);
const fs = __nccwpck_require__(9896);
const { glob } = __nccwpck_require__(1363);
const path = __nccwpck_require__(6928);

/**
 * Replaces specific placeholder strings within the provided content with corresponding dynamic values.
 *
 * This function specifically targets three placeholders:
 * - {{gitHash}} is replaced with the current Git commit hash, obtained from the GITHUB_SHA environment variable.
 * - {{gitRef}} is replaced with the Git reference (branch or tag) that triggered the workflow, obtained from the GITHUB_REF environment variable.
 * - {{deployTime}} is replaced with the current ISO timestamp.
 *
 * Note: This function is designed for use within a GitHub Actions workflow where GITHUB_SHA and GITHUB_REF environment variables are automatically set.
 *
 * @param {string} content - The string content in which placeholders are to be replaced.
 * @returns {string} The content with placeholders replaced by their respective dynamic values.
 */
function replacePlaceholders(content, hostname) {
const deployTime = new Date().toISOString();
return content
.replace(/{{gitHash}}/g, process.env.GITHUB_SHA)
.replace(/{{gitRef}}/g, process.env.GITHUB_REF)
.replace(/{{deployTime}}/g, deployTime)
.replace(/{{hostname}}/g, hostname);
}

/**
 * Reads all files matching a specified pattern, replaces certain placeholders in their content, and writes the updated content back to the files.
 *
 * This function searches for files in the filesystem using the provided glob pattern, optionally prefixed. It reads each file,
 * uses the `replacePlaceholders` function to replace specific placeholders in the file's content, and then writes the modified content
 * back to the original file. This is useful for dynamically updating file contents in a batch process, such as during a build or deployment.
 *
 * @param {string} pattern - The glob pattern used to find files. Example: '*.js' for all JavaScript files.
 * @param {string} [prefix] - An optional directory prefix to prepend to the glob pattern. This allows searching within a specific directory.
 * @returns {Promise<string[]>} A promise that resolves with an array of file paths that were processed, or rejects with an error if the process fails.
 */
async function readReplaceAndWriteFiles(pattern, prefix, hostname) {
const globPattern = prefix ? path.join(prefix, pattern) : pattern;
const files = await glob(globPattern);

let processPromises = files.map((file) => {
return fs.promises.readFile(file, "utf8").then((content) => {
content = replacePlaceholders(content, hostname);
return fs.promises.writeFile(file, content);
});
});

await Promise.all(processPromises);
return files;
}

/**
 * Reads files matching a glob pattern into a dictionary.
 * @param {string} pattern - Glob pattern to match files.
 * @param {string} prefix - Directory prefix for file paths.
 * @returns {Promise<Object>} - Promise resolving to a dictionary of file contents keyed by filenames.
 */
async function readFilesIntoDict(pattern, prefix) {
// Prepend the prefix to the glob pattern
const globPattern = prefix ? path.join(prefix, pattern) : pattern;
const files = await glob(globPattern);

let fileDict = {};
let readPromises = files.map((file) => {
return fs.promises.readFile(file, "utf8").then((content) => {
// Remove the prefix from the filename and drop the file suffix
let key = file;
if (prefix && file.startsWith(prefix)) {
key = key.slice(prefix.length);
}
key = path.basename(key, path.extname(key)); // Drop the file suffix

fileDict[key] = content;
});
});

await Promise.all(readPromises);
return fileDict;
}

/**
 * Validates the provided authentication credentials.
 * @param {string} token - The authentication token.
 * @param {string} username - The username.
 * @param {string} password - The password.
 * @returns {string|null} - Returns an error message if validation fails, otherwise null.
 */
function validateAuthentication(token, username, password) {
if (token) {
if (username || password) {
return "Token is defined along with username and/or password.";
}
} else {
if (!username && !password) {
return "Neither token nor password and username are defined.";
}
if (username && !password) {
return "Username is defined but no password is provided.";
}
if (!username && password) {
return "Password is defined but no username is provided.";
}
}
return null; // No errors found
}

/**
 * Posts code to Screeps server.
 */
async function postCode() {
const protocol = core.getInput("protocol") || "https";
const hostname = core.getInput("hostname") || "screeps.com";
const port = core.getInput("port") || "443";
const path = core.getInput("path") || "/";

const token = core.getInput("token") || undefined;
const username = core.getInput("username") || undefined;
const password = core.getInput("password") || undefined;
const prefix = core.getInput("source-prefix");
const pattern = core.getInput("pattern") || "*.js";
const branch = core.getInput("branch") || "default";

const gitReplace = core.getInput("git-replace") || null;

if (gitReplace) {
await readReplaceAndWriteFiles(gitReplace, prefix, hostname);
}

const files_to_push = await readFilesIntoDict(pattern, prefix);

core.info(`Trying to upload the following files to ${branch}:`);
Object.keys(files_to_push).forEach((key) => {
core.info(`Key: ${key}`);
});

const login_arguments = {
token: token,
username: username,
password: password,
protocol: protocol,
hostname: hostname,
port: port,
path: path,
};

core.info("login_arguments:");
core.info(JSON.stringify(login_arguments, null, 2));

const errorMessage = validateAuthentication(token, username, password);
if (errorMessage) {
core.error(errorMessage);
return;
}
let api = new ScreepsAPI(login_arguments);
if (token) {
const response = await api.code.set(branch, files_to_push);
core.info(JSON.stringify(response, null, 2));
console.log(`Code set successfully to ${branch}`);
} else {
core.info(`Logging in as user ${username}`);
await Promise.resolve()
.then(() => api.auth(username, password, login_arguments))
.then(() => {
api.code.set(branch, files_to_push);
})
.then(() => {
console.log(`Code set successfully to ${branch}`);
})
.catch((err) => {
console.error("Error:", err);
});
}
}
postCode();

module.exports = __webpack_exports__;
/******/
/******/ // startup
/******/ // Load entry module and return exports
/******/ // This entry module is referenced by other modules so it can't be inlined
/******/ var __webpack_exports__ = __nccwpck_require__(6136);
/******/ module.exports = __webpack_exports__;
/******/
/******/ })()
;
11
index.js
@@ -174,4 +174,15 @@ async function postCode() {
});
}
}

if (require.main === module) {
postCode();
}

module.exports = {
validateAuthentication,
replacePlaceholders,
postCode,
readReplaceAndWriteFiles,
readFilesIntoDict,
};
@@ -1 +0,0 @@
{"8b776b0173f34b8e7d376c35dbd515022335073f":{"files":{"index.js":["ZVPfPRYPn3JntuOZs2WuhZTA+Pg=",true]},"modified":1766795348715}}
@@ -1 +0,0 @@
{"b03bf7d94b68d6efff8cb09552f1880aa62ea1f0":{"files":{"index.js":["Y4JfDP5X7/wr1mlYLpop4yMG/vA=",true],"dist/index.js":["0uW46uAJG8qyUnSoKEh8QWxHJ4A=",true]},"modified":1766842302598}}
1762
package-lock.json
generated
File diff suppressed because it is too large
@@ -5,6 +5,7 @@
"main": "index.js",
"scripts": {
"start": "node index.js",
"test": "vitest run --globals --coverage",
"build": "ncc build index.js -o dist --external utf-8-validate --external bufferutil"
},
"dependencies": {
@@ -13,6 +14,8 @@
"screeps-api": "^1.7.2"
},
"devDependencies": {
"@vercel/ncc": "^0.38.4"
"@vercel/ncc": "^0.38.4",
"@vitest/coverage-v8": "^4.0.16",
"vitest": "^4.0.16"
}
}