Compare commits

..

17 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Thomas Hobson | 8727a545c6 | api: disable networking during execute | 2021-02-21 14:56:07 +13:00 |
| Thomas Hobson | 2f64f23896 | api: container hardening | 2021-02-21 14:25:03 +13:00 |
| Thomas Hobson | 5ac1285534 | api: lint | 2021-02-21 14:15:48 +13:00 |
| Thomas Hobson | 72f57ef1ce | docs: readme | 2021-02-21 13:37:21 +13:00 |
| Thomas Hobson | ac46c1b5bb | api: read both stdout and stderr | 2021-02-21 13:32:35 +13:00 |
| Thomas Hobson | f957019710 | deploy: docker compose file | 2021-02-21 13:15:27 +13:00 |
| Thomas Hobson | 7b2305f30c | api: add licence to package.json | 2021-02-21 13:15:11 +13:00 |
| Thomas Hobson | 60b258f57c | repo: Automated local repository builder | 2021-02-21 12:58:18 +13:00 |
| Thomas Hobson | 816efaff3b | pkg(python *): correct environment | 2021-02-21 12:57:40 +13:00 |
| Thomas Hobson | 233fb9bf26 | api: trim whitespace off env vars | 2021-02-21 12:57:20 +13:00 |
| Thomas Hobson | cdc65d6605 | api: use bash to call run/compile script | 2021-02-21 12:57:02 +13:00 |
| Thomas Hobson | b20f853ef1 | api: fix function name + allow unsigned packages | 2021-02-21 12:56:35 +13:00 |
| Thomas Hobson | 8ad62ec983 | api: use patched nocamel for fs/promises | 2021-02-21 12:06:20 +13:00 |
| Thomas Hobson | 60c004eea9 | api: lint **everything** | 2021-02-21 11:39:03 +13:00 |
| Thomas Hobson | 216451d1aa | pkg: add tar.gz unpack rule | 2021-02-21 03:29:47 +13:00 |
| Thomas Hobson | f1c082bfa1 | fix(python *): fix python rules | 2021-02-21 03:29:32 +13:00 |
| Thomas Hobson | 291cbe8c50 | pkg: fix secondary rules | 2021-02-21 03:29:13 +13:00 |
28 changed files with 1084 additions and 570 deletions

.gitignore (vendored, new file, +1 line)

@@ -0,0 +1 @@
data/

README.MD (new file, +238 lines)

@@ -0,0 +1,238 @@
<h1 align="center">
<a href="https://github.com/engineer-man/piston"><img src="docs/images/icon_circle.svg" width="25" height="25" alt="engineer-man piston"></a>
Piston
</h1>
<h3 align="center">A high performance general purpose code execution engine.</h3>
<br>
<p align="center">
<a href="https://github.com/engineer-man/piston/commits/master">
<img src="https://img.shields.io/github/last-commit/engineer-man/piston.svg?style=for-the-badge&logo=github&logoColor=white"
alt="GitHub last commit">
<a href="https://github.com/engineer-man/piston/issues">
<img src="https://img.shields.io/github/issues/engineer-man/piston.svg?style=for-the-badge&logo=github&logoColor=white"
alt="GitHub issues">
<a href="https://github.com/engineer-man/piston/pulls">
<img src="https://img.shields.io/github/issues-pr-raw/engineer-man/piston.svg?style=for-the-badge&logo=github&logoColor=white"
alt="GitHub pull requests">
</p>
---
<h4 align="center">
<a href="#About">About</a> •
<a href="#Public-API">Public API</a> •
<a href="#Getting-Started">Getting Started</a> •
<a href="#Usage">Usage</a> •
<a href="#Supported-Languages">Supported Languages</a> •
<a href="#Principle-of-Operation">Principles</a> •
<a href="#Security">Security</a> •
<a href="#License">License</a>
</h4>
---
<br>
# About
<h4>
Piston is a high performance general purpose code execution engine. It excels at running untrusted and
possibly malicious code without fear of any harmful effects.
</h4>
<br>
It's used in numerous places, including:
* [EMKC Challenges](https://emkc.org/challenges)
* [EMKC Weekly Contests](https://emkc.org/contests)
* [Engineer Man Discord Server](https://discord.gg/engineerman)
* the [I Run Code](https://github.com/engineer-man/piston-bot) Discord bot, as well as 1300+ other servers
and 100+ direct integrations

To add the I Run Code bot to your own server, visit https://emkc.org/run.
<br>
# Public API
- Requires no installation; you can use it immediately.
- Reference the Versions/Execute sections below to learn about the request and response formats.
<br>
When using the public Piston API, use the base URL:
```
https://emkc.org/api/v1/piston
```
#### GET
```
https://emkc.org/api/v1/piston/versions
```
#### POST
```
https://emkc.org/api/v1/piston/execute
```
> Important Note: The Piston API is rate-limited to 5 requests per second. If you need more requests than that
and it's for a good cause, please reach out to me (EngineerMan#0001) on [Discord](https://discord.gg/engineerman)
so we can discuss potentially getting you an unlimited key.
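
For a concrete starting point, here is a minimal sketch of a request from Node using `node-fetch` (the same HTTP client the API itself depends on); the request fields follow the Execute section below.

```js
const fetch = require('node-fetch');

// Minimal sketch: run a JS snippet on the public Piston API.
// The field names (language, source, stdin, args) come from the Execute section below.
fetch('https://emkc.org/api/v1/piston/execute', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
        language: 'js',
        source: 'console.log("hello world")',
        stdin: '',
        args: []
    })
})
    .then(res => res.json())
    .then(result => console.log(result.output));
```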
<br>
# Getting Started
### Host System Package Dependencies
- Docker
- Docker Compose
- Node.js
#### After system dependencies are installed, clone this repository:
```sh
# clone and enter repo
git clone https://github.com/engineer-man/piston
```
#### Installation
- `docker-compose up`
#### CLI Usage
- `cli/execute [language] [file path] [args]`
<br>
# Usage
### CLI
```sh
cli/execute [language] [file path] [args]
```
### API
The API must be started before use. Note that if root is required to access
LXC, then the API must also run as root. To start the API, run the following:
```
cd api
./start
```
For your own local installation, the API is available at:
```
http://127.0.0.1:2000
```
#### Versions Endpoint
`GET /versions`
This endpoint returns the supported languages, along with the current version and aliases of each. To execute
code for a particular language using the `/execute` endpoint, either the name or one of the aliases must
be provided.
```json
HTTP/1.1 200 OK
Content-Type: application/json
[
{
"name": "awk",
"aliases": ["awk"],
"version": "1.3.3"
},
{
"name": "bash",
"aliases": ["bash"],
"version": "4.4.20"
},
{
"name": "c",
"aliases": ["c"],
"version": "7.5.0"
}
]
```
#### Execute Endpoint
`POST /execute`
This endpoint requests execution of some arbitrary code.
- `language` (**required**) The language to use for execution, must be a string and supported by Piston (see list below).
- `source` (**required**) The source code to execute, must be a string.
- `stdin` (*optional*) The text to pass as stdin to the program. Must be a string or left out of the request.
- `args` (*optional*) The arguments to pass to the program. Must be an array or left out of the request.
```json
{
"language": "js",
"source": "console.log(process.argv)",
"stdin": "",
"args": [
"1",
"2",
"3"
]
}
```
A typical response to a successful execution contains the `language` and `version`, an `output` field
combining `stdout` and `stderr` in the chronological order the program produced them,
and the separate `stdout` and `stderr` streams.
```json
HTTP/1.1 200 OK
Content-Type: application/json
{
"ran": true,
"language": "js",
"version": "12.13.0",
"output": "[ '/usr/bin/node',\n '/tmp/code.code',\n '1',\n '2',\n '3' ]",
"stdout": "[ '/usr/bin/node',\n '/tmp/code.code',\n '1',\n '2',\n '3' ]",
"stderr": ""
}
```
If there is a problem with the request, a `400` status code is returned, with the reason in the `message` key.
```json
HTTP/1.1 400 Bad Request
Content-Type: application/json
{
"message": "Supplied language is not supported by Piston"
}
```
<br>
# Supported Languages
`python`
<br>
<!--
# Principle of Operation
Piston utilizes LXC as the primary mechanism for sandboxing. There is a small API written in Node which takes
in execution requests and executes them in the container. At a high level, the API writes
temporary source and args files to `/tmp`, and those are mounted read-only into the container along with the execution scripts.
The source file is either run, or compiled and then run (for languages like C, C++, C#, Go, etc.).
-->
<br>
<!--
# Security
LXC provides a great deal of security out of the box, since it is isolated from the host system.
Piston takes additional steps to make it resistant to
various privilege escalation, denial-of-service, and resource saturation threats. These steps include:
- Disabling outgoing network interaction
- Capping max processes at 64 (resists `:(){ :|: &}:;`, `while True: os.fork()`, etc.)
- Capping max files at 2048 (resists various file based attacks)
- Mounting all resources read-only (resists `sudo rm -rf --no-preserve-root /`)
- Cleaning up all temp space after each execution (resists out of drive space attacks)
- Running as a variety of unprivileged users
- Capping runtime execution at 3 seconds
- Capping stdout to 65536 characters (resists yes/no bombs and runaway output)
- SIGKILLing misbehaving code
-->
<br>
<!-- Someone please do this -->
# License
Piston is licensed under the MIT license.

.eslintrc

@@ -8,6 +8,7 @@
"snakecasejs" "snakecasejs"
], ],
"extends": "eslint:recommended", "extends": "eslint:recommended",
"parser": "babel-eslint",
"parserOptions": { "parserOptions": {
"ecmaVersion": 12 "ecmaVersion": 12
}, },
@@ -27,11 +28,11 @@
], ],
"quotes": [ "quotes": [
"error", "error",
"double" "single"
], ],
"semi": [ "semi": [
"error", "error",
"never" "always"
], ],
"no-unused-vars": ["error", { "argsIgnorePattern": "^_"}], "no-unused-vars": ["error", { "argsIgnorePattern": "^_"}],
"snakecasejs/snakecasejs": "warn" "snakecasejs/snakecasejs": "warn"

Dockerfile

@@ -1,5 +1,13 @@
FROM node:15.8.0-alpine3.13 FROM node:15.8.0-alpine3.13
RUN apk add --no-cache gnupg tar bash coreutils RUN apk add --no-cache gnupg tar bash coreutils shadow
RUN for i in $(seq 1000 1500); do \
groupadd -g $i runner$i && \
useradd -M runner$i -g $i -u $i && \
echo "runner$i soft nproc 64" >> /etc/security/limits.conf && \
echo "runner$i hard nproc 64" >> /etc/security/limits.conf && \
echo "runner$i soft nofile 2048" >> /etc/security/limits.conf && \
echo "runner$i hard nofile 2048" >> /etc/security/limits.conf ;\
done
ENV NODE_ENV=production ENV NODE_ENV=production
WORKDIR /piston_api WORKDIR /piston_api
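
The 501 runner accounts created here (uids and gids 1000 through 1500) line up with the `runner_uid_min`/`runner_uid_max` and `runner_gid_min`/`runner_gid_max` defaults in config.js below. A condensed sketch of how executor/job.js (further down) cycles through that pool so concurrent jobs never share an account:

```js
// Condensed from the Job constructor in executor/job.js below: each new job
// takes the next uid/gid in the pool, wrapping around at runner_*_max.
var uid = 0;
var gid = 0;

function next_ids(config) {
    const job_uid = config.runner_uid_min + uid;
    const job_gid = config.runner_gid_min + gid;
    uid = (uid + 1) % (config.runner_uid_max - config.runner_uid_min + 1);
    gid = (gid + 1) % (config.runner_gid_max - config.runner_gid_min + 1);
    return { uid: job_uid, gid: job_gid };
}
```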

package.json

@@ -9,14 +9,16 @@
"is-docker": "^2.1.1", "is-docker": "^2.1.1",
"js-yaml": "^4.0.0", "js-yaml": "^4.0.0",
"logplease": "^1.2.15", "logplease": "^1.2.15",
"nocamel": "*", "nocamel": "HexF/nocamel#patch-1",
"node-fetch": "^2.6.1", "node-fetch": "^2.6.1",
"semver": "^7.3.4", "semver": "^7.3.4",
"uuid": "^8.3.2", "uuid": "^8.3.2",
"yargs": "^16.2.0" "yargs": "^16.2.0"
}, },
"devDependencies": { "devDependencies": {
"babel-eslint": "^10.1.0",
"eslint": "^7.20.0", "eslint": "^7.20.0",
"eslint-plugin-snakecasejs": "^2.2.0" "eslint-plugin-snakecasejs": "^2.2.0"
} },
"license": "MIT"
} }

cache.js

@@ -1,54 +1,55 @@
const globals = require("./globals") const globals = require('./globals');
const logger = require("logplease").create("cache") const logger = require('logplease').create('cache');
const fs = require("fs"), path = require("path") const fs = require('fs/promises'),
const util = require("util") fss = require('fs'),
path = require('path');
const cache = new Map() const cache = new Map();
module.exports = { module.exports = {
cache_key: (context, key) => Buffer.from(`${context}-${key}`).toString("base64"), cache_key: (context, key) => Buffer.from(`${context}-${key}`).toString('base64'),
has(key){ has(key){
return cache.has(key) && cache.get(key).expiry > Date.now() return cache.has(key) && cache.get(key).expiry > Date.now();
}, },
async get(key, callback, ttl=globals.cache_ttl){ async get(key, callback, ttl=globals.cache_ttl){
logger.debug("get:", key) logger.debug('get:', key);
if(module.exports.has(key)){ if(module.exports.has(key)){
logger.debug("hit:",key) logger.debug('hit:',key);
return cache.get(key).data return cache.get(key).data;
} }
logger.debug("miss:", key) logger.debug('miss:', key);
var data = await callback() var data = await callback();
cache.set(key, {data, expiry: Date.now() + ttl}) cache.set(key, {data, expiry: Date.now() + ttl});
return data return data;
}, },
async flush(cache_dir){ async flush(cache_dir){
logger.info("Flushing cache") logger.info('Flushing cache');
cache.forEach((v,k)=>{ cache.forEach((v,k)=>{
var file_path = path.join(cache_dir, k) var file_path = path.join(cache_dir, k);
if(v.expiry < Date.now()){ if(v.expiry < Date.now()){
//remove from cache //remove from cache
cache.delete(k) cache.delete(k);
fs.stat(file_path, (err, stats)=>{ fs.stat(file_path, (err, stats)=>{
if(err) return //ignore - probably hasn't been flushed yet if(err) return; //ignore - probably hasn't been flushed yet
if(stats.is_file()) if(stats.is_file())
fs.rm(file_path, (err)=>{ fs.rm(file_path, (err)=>{
if(err) logger.warn(`Couldn't clean up on-disk cache file ${k}`) if(err) logger.warn(`Couldn't clean up on-disk cache file ${k}`);
}) });
}) });
}else{ }else{
//flush to disk //flush to disk
fs.write_file(file_path, JSON.stringify(v),()=>{}) fs.write_file(file_path, JSON.stringify(v),()=>{});
} }
}) });
}, },
async load(cache_dir){ async load(cache_dir){
return util.promisify(fs.readdir)(cache_dir) return fs.readdir(cache_dir)
.then(files => Promise.all(files.map( .then(files => Promise.all(files.map(
async file => { async file => {
cache.set(file, JSON.parse(fs.read_file_sync(path.join(cache_dir,file)).toString())) cache.set(file, JSON.parse(fss.read_file_sync(path.join(cache_dir,file)).toString()));
} }
))) )));
} }
} };
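
A usage sketch for this module: `get` returns the cached value while it is fresh and only invokes the callback on a miss, storing the result for `ttl` milliseconds. Here `download_index` is a hypothetical stand-in for a real fetch such as the one ppman/repo.js performs below.

```js
const cache = require('./cache');

// Hypothetical caller: download a repository index at most once per TTL window.
async function load_index(slug) {
    const key = cache.cache_key('repo', slug);
    return await cache.get(key, async () => {
        return await download_index(slug); // stand-in for the real download
    });
}
```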

config.js

@@ -1,9 +1,9 @@
const fs = require("fs") const fss = require('fs');
const yargs = require("yargs") const yargs = require('yargs');
const hide_bin = require("yargs/helpers").hideBin //eslint-disable-line snakecasejs/snakecasejs const hide_bin = require('yargs/helpers').hideBin; //eslint-disable-line snakecasejs/snakecasejs
const Logger = require("logplease") const Logger = require('logplease');
const logger = Logger.create("config") const logger = Logger.create('config');
const yaml = require("js-yaml") const yaml = require('js-yaml');
const header = `# const header = `#
# ____ _ _ # ____ _ _
@@ -16,151 +16,151 @@ const header = `#
# github.com/engineer-man/piston # github.com/engineer-man/piston
# #
` `;
const argv = yargs(hide_bin(process.argv)) const argv = yargs(hide_bin(process.argv))
.usage("Usage: $0 -c [config]") .usage('Usage: $0 -c [config]')
.demandOption("c") //eslint-disable-line snakecasejs/snakecasejs .demandOption('c') //eslint-disable-line snakecasejs/snakecasejs
.option("config", { .option('config', {
alias: "c", alias: 'c',
describe: "config file to load from", describe: 'config file to load from',
default: "/piston/config.yaml" default: '/piston/config.yaml'
}) })
.option("make-config", { .option('make-config', {
alias: "m", alias: 'm',
type: "boolean", type: 'boolean',
describe: "create config file and populate defaults if it does not already exist" describe: 'create config file and populate defaults if it does not already exist'
}).argv }).argv;
const options = [ const options = [
{ {
key: "log_level", key: 'log_level',
desc: "Level of data to log", desc: 'Level of data to log',
default: "INFO", default: 'INFO',
/* eslint-disable snakecasejs/snakecasejs */ /* eslint-disable snakecasejs/snakecasejs */
options: Object.values(Logger.LogLevels), options: Object.values(Logger.LogLevels),
validators: [x=>Object.values(Logger.LogLevels).includes(x) || `Log level ${x} does not exist`] validators: [x=>Object.values(Logger.LogLevels).includes(x) || `Log level ${x} does not exist`]
/* eslint-enable snakecasejs/snakecasejs */ /* eslint-enable snakecasejs/snakecasejs */
}, },
{ {
key: "bind_address", key: 'bind_address',
desc: "Address to bind REST API on\nThank @Bones for the number", desc: 'Address to bind REST API on\nThank @Bones for the number',
default: "0.0.0.0:6969", default: '0.0.0.0:6969',
validators: [] validators: []
}, },
{ {
key: "data_directory", key: 'data_directory',
desc: "Absolute path to store all piston related data at", desc: 'Absolute path to store all piston related data at',
default: "/piston", default: '/piston',
validators: [x=> fs.exists_sync(x) || `Directory ${x} does not exist`] validators: [x=> fss.exists_sync(x) || `Directory ${x} does not exist`]
}, },
{ {
key: "cache_ttl", key: 'cache_ttl',
desc: "Time in milliseconds to keep data in cache for at a maximum", desc: 'Time in milliseconds to keep data in cache for at a maximum',
default: 60 * 60 * 1000, default: 60 * 60 * 1000,
validators: [] validators: []
}, },
{ {
key: "cache_flush_time", key: 'cache_flush_time',
desc: "Interval in milliseconds to flush cache to disk at", desc: 'Interval in milliseconds to flush cache to disk at',
default: 90 * 60 * 1000, //90 minutes default: 90 * 60 * 1000, //90 minutes
validators: [] validators: []
}, },
{ {
key: "state_flush_time", key: 'state_flush_time',
desc: "Interval in milliseconds to flush state to disk at", desc: 'Interval in milliseconds to flush state to disk at',
default: 5000, // 5 seconds (file is tiny) default: 5000, // 5 seconds (file is tiny)
validators: [] validators: []
}, },
{ {
key: "runner_uid_min", key: 'runner_uid_min',
desc: "Minimum uid to use for runner", desc: 'Minimum uid to use for runner',
default: 1000, default: 1000,
validators: [] validators: []
}, },
{ {
key: "runner_uid_max", key: 'runner_uid_max',
desc: "Maximum uid to use for runner", desc: 'Maximum uid to use for runner',
default: 1500, default: 1500,
validators: [] validators: []
}, },
{ {
key: "runner_gid_min", key: 'runner_gid_min',
desc: "Minimum gid to use for runner", desc: 'Minimum gid to use for runner',
default: 1000, default: 1000,
validators: [] validators: []
}, },
{ {
key: "runner_gid_max", key: 'runner_gid_max',
desc: "Maximum gid to use for runner", desc: 'Maximum gid to use for runner',
default: 1500, default: 1500,
validators: [] validators: []
} }
] ];
const default_config = [ const default_config = [
...header.split("\n"), ...header.split('\n'),
...options.map(option => ` ...options.map(option => `
${[ ${[
...option.desc.split("\n"), ...option.desc.split('\n'),
option.options?("Options: " + option.options.join(", ")):"" option.options?('Options: ' + option.options.join(', ')):''
].filter(x=>x.length>0).map(x=>`# ${x}`).join("\n")} ].filter(x=>x.length>0).map(x=>`# ${x}`).join('\n')}
${option.key}: ${option.default} ${option.key}: ${option.default}
`)].join("\n") `)].join('\n');
logger.info(`Loading Configuration from ${argv.config}`) logger.info(`Loading Configuration from ${argv.config}`);
!!argv["make-config"] && logger.debug("Make configuration flag is set") !!argv['make-config'] && logger.debug('Make configuration flag is set');
if(!!argv["make-config"] && !fs.exists_sync(argv.config)){ if(!!argv['make-config'] && !fss.exists_sync(argv.config)){
logger.info("Writing default configuration...") logger.info('Writing default configuration...');
try { try {
fs.write_file_sync(argv.config, default_config) fss.write_file_sync(argv.config, default_config);
} catch (err) { } catch (err) {
logger.error("Error writing default configuration:", err.message) logger.error('Error writing default configuration:', err.message);
process.exit(1) process.exit(1);
} }
} }
var config = {} var config = {};
logger.debug("Reading config file") logger.debug('Reading config file');
try{ try{
const cfg_content = fs.read_file_sync(argv.config) const cfg_content = fss.read_file_sync(argv.config);
try{ try{
config = yaml.load(cfg_content) config = yaml.load(cfg_content);
}catch(err){ }catch(err){
logger.error("Error parsing configuration file:", err.message) logger.error('Error parsing configuration file:', err.message);
process.exit(1) process.exit(1);
} }
}catch(err){ }catch(err){
logger.error("Error reading configuration from disk:", err.message) logger.error('Error reading configuration from disk:', err.message);
process.exit(1) process.exit(1);
} }
logger.debug("Validating config entries") logger.debug('Validating config entries');
var errored=false var errored=false;
options.forEach(opt => { options.forEach(opt => {
logger.debug("Checking key",opt.key) logger.debug('Checking key',opt.key);
var cfg_val = config[opt.key] var cfg_val = config[opt.key];
if(cfg_val == undefined){ if(cfg_val == undefined){
errored = true errored = true;
logger.error(`Config key ${opt.key} does not exist on currently loaded configuration`) logger.error(`Config key ${opt.key} does not exist on currently loaded configuration`);
return return;
} }
opt.validators.forEach(validator => { opt.validators.forEach(validator => {
var response = validator(cfg_val) var response = validator(cfg_val);
if(response !== true){ if(response !== true){
errored = true errored = true;
logger.error(`Config key ${opt.key} failed validation:`, response) logger.error(`Config key ${opt.key} failed validation:`, response);
return return;
} }
}) });
}) });
if(errored) process.exit(1) if(errored) process.exit(1);
logger.info("Configuration successfully loaded") logger.info('Configuration successfully loaded');
module.exports = config module.exports = config;
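
Given the options and defaults above, the file written by `--make-config` would presumably look like the following once the descriptive comments are stripped (times are in milliseconds, so 60 * 60 * 1000 = 3600000):

```yaml
log_level: INFO
bind_address: 0.0.0.0:6969
data_directory: /piston
cache_ttl: 3600000
cache_flush_time: 5400000
state_flush_time: 5000
runner_uid_min: 1000
runner_uid_max: 1500
runner_gid_min: 1000
runner_gid_max: 1500
```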

executor/job.js

@@ -1,146 +1,146 @@
const logger = require("logplease").create("executor/job") const logger = require('logplease').create('executor/job');
const { v4: uuidv4 } = require("uuid") const { v4: uuidv4 } = require('uuid');
const cp = require("child_process") const cp = require('child_process');
const path = require("path") const path = require('path');
const config = require("../config"); const config = require('../config');
const globals = require("../globals"); const globals = require('../globals');
const fs = require("fs"); const fs = require('fs/promises');
const util = require("util");
const job_states = { const job_states = {
READY: Symbol("Ready to be primed"), READY: Symbol('Ready to be primed'),
PRIMED: Symbol("Primed and ready for execution"), PRIMED: Symbol('Primed and ready for execution'),
EXECUTED: Symbol("Executed and ready for cleanup") EXECUTED: Symbol('Executed and ready for cleanup')
} };
var uid=0; var uid=0;
var gid=0; var gid=0;
class Job { class Job {
constructor(runtime, files, args, stdin, timeouts, main){ constructor(runtime, files, args, stdin, timeouts, main){
this.uuid = uuidv4() this.uuid = uuidv4();
this.runtime = runtime this.runtime = runtime;
this.files = files this.files = files;
this.args = args this.args = args;
this.stdin = stdin this.stdin = stdin;
this.timeouts = timeouts this.timeouts = timeouts;
this.main = main this.main = main;
if(!Object.keys(this.files).includes(this.main)) if(!this.files.map(f=>f.name).includes(this.main))
throw new Error(`Main file "${this.main}" will not be written to disk`) throw new Error(`Main file "${this.main}" will not be written to disk`);
this.uid = config.runner_uid_min + uid; this.uid = config.runner_uid_min + uid;
this.gid = config.runner_gid_min + gid; this.gid = config.runner_gid_min + gid;
uid++ uid++;
gid++ gid++;
uid %= (config.runner_uid_max - config.runner_uid_min) + 1 uid %= (config.runner_uid_max - config.runner_uid_min) + 1;
gid %= (config.runner_gid_max - config.runner_gid_min) + 1 gid %= (config.runner_gid_max - config.runner_gid_min) + 1;
this.state = job_states.READY; this.state = job_states.READY;
this.dir = path.join(config.data_directory, globals.data_directories.jobs, this.uuid); this.dir = path.join(config.data_directory, globals.data_directories.jobs, this.uuid);
} }
async prime(){ async prime(){
logger.info(`Priming job uuid=${this.uuid}`) logger.info(`Priming job uuid=${this.uuid}`);
logger.debug("Writing files to job cache") logger.debug('Writing files to job cache');
await util.promisify(fs.mkdir)(this.dir, {mode:0o700}) await fs.mkdir(this.dir, {mode:0o700});
const files = Object.keys(this.files).map(fileName => { const files = this.files.map(({name: file_name, content}) => {
var content = this.files[fileName]; return fs.write_file(path.join(this.dir, file_name), content);
return util.promisify(fs.writeFile)(path.join(this.dir, fileName), content) });
})
await Promise.all(files) await Promise.all(files);
logger.debug(`Transfering ownership uid=${this.uid} gid=${this.gid}`) logger.debug(`Transfering ownership uid=${this.uid} gid=${this.gid}`);
await util.promisify(fs.chown)(this.dir, this.uid, this.gid) await fs.chown(this.dir, this.uid, this.gid);
const chowns = Object.keys(this.files).map(fileName => { const chowns = this.files.map(({name:file_name}) => {
return util.promisify(fs.chown)(path.join(this.dir, fileName), this.uid, this.gid) return fs.chown(path.join(this.dir, file_name), this.uid, this.gid);
}) });
await Promise.all(chowns) await Promise.all(chowns);
this.state = job_states.PRIMED; this.state = job_states.PRIMED;
logger.debug("Primed job") logger.debug('Primed job');
} }
async execute(){ async execute(){
if(this.state != job_states.PRIMED) throw new Error("Job must be in primed state, current state: " + this.state.toString()) if(this.state != job_states.PRIMED) throw new Error('Job must be in primed state, current state: ' + this.state.toString());
logger.info(`Executing job uuid=${this.uuid} uid=${this.uid} gid=${this.gid} runtime=${this.runtime.toString()}`) logger.info(`Executing job uuid=${this.uuid} uid=${this.uid} gid=${this.gid} runtime=${this.runtime.toString()}`);
logger.debug(`Compiling`) logger.debug('Compiling');
const compile = this.runtime.compiled && await new Promise((resolve, reject) => { const compile = this.runtime.compiled && await new Promise((resolve, reject) => {
var stderr, stdout = ""; var stdout = '';
const proc = cp.spawn(this.runtime.pkgdir, [this.main, ...this.args] ,{ var stderr = '';
const proc = cp.spawn('unshare', ['-n', 'bash', path.join(this.runtime.pkgdir, 'compile'),this.main, ...this.files] ,{
env: this.runtime.env_vars, env: this.runtime.env_vars,
stdio: ['pipe', 'pipe', 'pipe'], stdio: ['pipe', 'pipe', 'pipe'],
cwd: this.dir, cwd: this.dir,
uid: this.uid, uid: this.uid,
gid: this.gid gid: this.gid
}) });
const killTimeout = setTimeout(proc.kill, this.timeouts.compile, "SIGKILL") const kill_timeout = setTimeout(proc.kill, this.timeouts.compile, 'SIGKILL');
proc.stderr.on('data', d=>stderr += d) proc.stderr.on('data', d=>stderr += d);
proc.stdout.on('data', d=>stdout += d) proc.stdout.on('data', d=>stdout += d);
proc.on('exit', (code, signal)=>{ proc.on('exit', (code, signal)=>{
clearTimeout(killTimeout); clearTimeout(kill_timeout);
resolve({stdout, stderr, code, signal}) resolve({stdout, stderr, code, signal});
}) });
proc.on('error', (code, signal) => { proc.on('error', (err) => {
clearTimeout(killTimeout); clearTimeout(kill_timeout);
reject({stdout, stderr, code, signal}) reject({error: err, stdout, stderr});
}) });
}) });
logger.debug("Running") logger.debug('Running');
const run = await new Promise((resolve, reject) => { const run = await new Promise((resolve, reject) => {
var stderr, stdout = ""; var stdout = '';
const proc = cp.spawn('bash', [path.join(this.runtime.pkgdir, "run"), this.main, ...this.args] ,{ var stderr = '';
const proc = cp.spawn('unshare', ['-n', 'bash', path.join(this.runtime.pkgdir, 'run'),this.main, ...this.args] ,{
env: this.runtime.env_vars, env: this.runtime.env_vars,
stdio: ['pipe', 'pipe', 'pipe'], stdio: ['pipe', 'pipe', 'pipe'],
cwd: this.dir, cwd: this.dir,
uid: this.uid, uid: this.uid,
gid: this.gid gid: this.gid
}) });
const killTimeout = setTimeout(proc.kill, this.timeouts.run, "SIGKILL") const kill_timeout = setTimeout(proc.kill, this.timeouts.run, 'SIGKILL');
proc.stderr.on('data', d=>stderr += d) proc.stderr.on('data', d=>stderr += d);
proc.stdout.on('data', d=>stdout += d) proc.stdout.on('data', d=>stdout += d);
proc.on('exit', (code, signal)=>{ proc.on('exit', (code, signal)=>{
clearTimeout(killTimeout); clearTimeout(kill_timeout);
resolve({stdout, stderr, code, signal}) resolve({stdout, stderr, code, signal});
}) });
proc.on('error', (code, signal) => { proc.on('error', (err) => {
clearTimeout(killTimeout); clearTimeout(kill_timeout);
reject({stdout, stderr, code, signal}) reject({error: err, stdout, stderr});
}) });
}) });
this.state = job_states.EXECUTED; this.state = job_states.EXECUTED;
return { return {
compile, compile,
run run
} };
} }
async cleanup(){ async cleanup(){
logger.info(`Cleaning up job uuid=${this.uuid}`) logger.info(`Cleaning up job uuid=${this.uuid}`);
await util.promisify(fs.rm)(this.dir, {recursive: true, force: true}) await fs.rm(this.dir, {recursive: true, force: true});
} }
} }
module.exports = {Job} module.exports = {Job};
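
A job moves through prime, execute, and cleanup, which is exactly the sequence `run_job` in executor/routes.js below drives. A minimal sketch, with arbitrary illustrative timeout values:

```js
const { Job } = require('./job');

// Sketch of the lifecycle that run_job() in executor/routes.js drives.
async function run_once(runtime, files, args, stdin) {
    const job = new Job(runtime, files, args, stdin,
        { run: 3000, compile: 10000 }, files[0].name);

    await job.prime();                            // write files, chown them to the runner uid/gid
    const { compile, run } = await job.execute(); // spawn via `unshare -n bash ...`
    await job.cleanup();                          // remove the per-job directory
    return { compile, run };
}
```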

executor/routes.js

@@ -1,34 +1,34 @@
// {"language":"python","version":"3.9.1","files":{"code.py":"print('hello world')"},"args":[],"stdin":"","compile_timeout":10, "run_timeout":3, "main": "code.py"} // {"language":"python","version":"3.9.1","files":{"code.py":"print('hello world')"},"args":[],"stdin":"","compile_timeout":10, "run_timeout":3, "main": "code.py"}
// {"success":true, "run":{"stdout":"hello world", "stderr":"", "error_code":0},"compile":{"stdout":"","stderr":"","error_code":0}} // {"success":true, "run":{"stdout":"hello world", "stderr":"", "error_code":0},"compile":{"stdout":"","stderr":"","error_code":0}}
const { get_latest_runtime_matching_language_version } = require("../runtime"); const { get_latest_runtime_matching_language_version } = require('../runtime');
const { Job } = require("./job"); const { Job } = require('./job');
module.exports = { module.exports = {
async run_job(req, res){ async run_job(req, res){
// POST /jobs // POST /jobs
var errored = false; var errored = false;
["language", "version", ['language', 'version',
"files", "main", 'files', 'main',
"args", "stdin", 'args', 'stdin',
"compile_timeout", "run_timeout", 'compile_timeout', 'run_timeout',
].forEach(key => { ].forEach(key => {
if(req.body[key] == undefined) errored = errored || res.json_error(`${key} is required`, 400) if(req.body[key] == undefined) errored = errored || res.json_error(`${key} is required`, 400);
}) });
if(errored) return errored; if(errored) return errored;
const runtime = get_latest_runtime_matching_language_version(req.body.language, req.body.version); const runtime = get_latest_runtime_matching_language_version(req.body.language, req.body.version);
if(runtime == undefined) return res.json_error(`${req.body.language}-${req.body.version} runtime is unknown`, 400) if(runtime == undefined) return res.json_error(`${req.body.language}-${req.body.version} runtime is unknown`, 400);
const job = new Job(runtime, req.body.files, req.body.args, req.body.stdin, {run: req.body.run_timeout, compile: req.body.compile_timeout}, req.body.main) const job = new Job(runtime, req.body.files, req.body.args, req.body.stdin, {run: req.body.run_timeout, compile: req.body.compile_timeout}, req.body.main);
await job.prime() await job.prime();
const result = await job.execute() const result = await job.execute();
res.json_success(result) res.json_success(result);
await job.cleanup() await job.cleanup();
} }
} };
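
Given the new files representation in executor/job.js above (an array of `{name, content}` objects rather than the name-to-content map still shown in this file's header comment), a `POST /jobs` body would presumably look like:

```json
{
    "language": "python",
    "version": "3.9.1",
    "files": [
        { "name": "code.py", "content": "print('hello world')" }
    ],
    "main": "code.py",
    "args": [],
    "stdin": "",
    "compile_timeout": 10,
    "run_timeout": 3
}
```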

globals.js

@@ -1,25 +1,25 @@
// Globals are things the user shouldn't change in config, but is good to not use inline constants for // Globals are things the user shouldn't change in config, but is good to not use inline constants for
const is_docker = require("is-docker") const is_docker = require('is-docker');
const fs = require("fs") const fss = require('fs');
const platform = `${is_docker() ? "docker" : "baremetal"}-${ const platform = `${is_docker() ? 'docker' : 'baremetal'}-${
fs.read_file_sync("/etc/os-release") fss.read_file_sync('/etc/os-release')
.toString() .toString()
.split("\n") .split('\n')
.find(x=>x.startsWith("ID")) .find(x=>x.startsWith('ID'))
.replace("ID=","") .replace('ID=','')
}` }`;
module.exports = { module.exports = {
data_directories: { data_directories: {
cache: "cache", cache: 'cache',
packages: "packages", packages: 'packages',
runtimes: "runtimes", runtimes: 'runtimes',
jobs: "jobs" jobs: 'jobs'
}, },
data_files:{ data_files:{
state: "state.json" state: 'state.json'
}, },
version: require("../package.json").version, version: require('../package.json').version,
platform, platform,
pkg_installed_file: ".ppman-installed" //Used as indication for if a package was installed pkg_installed_file: '.ppman-installed' //Used as indication for if a package was installed
} };
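
For illustration: inside the container built from the Dockerfile above, `is-docker` reports true and `/etc/os-release` carries `ID=alpine`, so the exported `platform` would presumably evaluate to `docker-alpine`.

```js
const globals = require('./globals');

// In the Alpine-based container above, is_docker() is true and /etc/os-release
// contains "ID=alpine", so this would presumably print "docker-alpine".
console.log(globals.platform);
```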

helpers.js

@@ -1,34 +1,33 @@
const fs = require("fs"), const fs = require('fs/promises'),
path= require("path"), path= require('path'),
util = require("util"), fetch = require('node-fetch'),
fetch = require("node-fetch"), urlp = require('url');
urlp = require("url")
module.exports = { module.exports = {
async buffer_from_u_r_l(url){ async buffer_from_url(url){
if(!(url instanceof URL)) if(!(url instanceof URL))
url = new URL(url) url = new URL(url);
if(url.protocol == "file:"){ if(url.protocol == 'file:'){
//eslint-disable-next-line snakecasejs/snakecasejs //eslint-disable-next-line snakecasejs/snakecasejs
return await util.promisify(fs.read_file)(urlp.fileURLToPath(url)) return await fs.read_file(urlp.fileURLToPath(url));
}else{ }else{
return await fetch({ return await fetch({
url: url.toString() url: url.toString()
}) });
} }
}, },
add_url_base_if_required(url, base){ add_url_base_if_required(url, base){
try{ try{
return new URL(url) return new URL(url);
}catch{ }catch{
//Assume this is a file name //Assume this is a file name
return new URL(url, base + "/") return new URL(url, base + '/');
} }
}, },
url_basename(url){ url_basename(url){
return path.basename(url.pathname) return path.basename(url.pathname);
}, },
} };
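
A brief sketch of how these helpers compose in ppman below (the URL here is purely illustrative): a download path without a scheme is resolved against the repository's base URL before fetching.

```js
const helpers = require('./helpers');

// 'pkg.tar.gz' has no scheme, so it is resolved against the base URL.
const url = helpers.add_url_base_if_required('pkg.tar.gz', 'https://example.com/repo');
console.log(url.toString());            // https://example.com/repo/pkg.tar.gz
console.log(helpers.url_basename(url)); // pkg.tar.gz
```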

index.js

@@ -1,106 +1,105 @@
#!/usr/bin/env node #!/usr/bin/env node
require("nocamel") require('nocamel');
const Logger = require("logplease") const Logger = require('logplease');
const express = require("express") const express = require('express');
const globals = require("./globals") const globals = require('./globals');
const config = require("./config") const config = require('./config');
const cache = require("./cache") const cache = require('./cache');
const state = require("./state") const state = require('./state');
const path = require("path") const path = require('path');
const fs = require("fs") const fs = require('fs/promises');
const util = require("util") const fss = require('fs');
const body_parser = require("body-parser") const body_parser = require('body-parser');
const runtime = require("./runtime") const runtime = require('./runtime');
const logger = Logger.create("index") const logger = Logger.create('index');
const app = express(); const app = express();
(async () => { (async () => {
logger.info("Setting loglevel to",config.log_level) logger.info('Setting loglevel to',config.log_level);
Logger.setLogLevel(config.log_level) //eslint-disable-line snakecasejs/snakecasejs Logger.setLogLevel(config.log_level); //eslint-disable-line snakecasejs/snakecasejs
logger.debug("Ensuring data directories exist") logger.debug('Ensuring data directories exist');
Object.values(globals.data_directories).forEach(dir => { Object.values(globals.data_directories).forEach(dir => {
var data_path = path.join(config.data_directory, dir) var data_path = path.join(config.data_directory, dir);
logger.debug(`Ensuring ${data_path} exists`) logger.debug(`Ensuring ${data_path} exists`);
if(!fs.exists_sync(data_path)){ if(!fss.exists_sync(data_path)){
logger.info(`${data_path} does not exist.. Creating..`) logger.info(`${data_path} does not exist.. Creating..`);
try{ try{
fs.mkdir_sync(data_path) fss.mkdir_sync(data_path);
}catch(err){ }catch(err){
logger.error(`Failed to create ${data_path}: `, err.message) logger.error(`Failed to create ${data_path}: `, err.message);
} }
} }
}) });
logger.info("Loading state") logger.info('Loading state');
await state.load(path.join(config.data_directory,globals.data_files.state)) await state.load(path.join(config.data_directory,globals.data_files.state));
logger.info("Loading cache") logger.info('Loading cache');
await cache.load(path.join(config.data_directory,globals.data_directories.cache)) await cache.load(path.join(config.data_directory,globals.data_directories.cache));
logger.info("Loading packages") logger.info('Loading packages');
const pkgdir = path.join(config.data_directory,globals.data_directories.packages) const pkgdir = path.join(config.data_directory,globals.data_directories.packages);
await util.promisify(fs.readdir)(pkgdir) await fs.readdir(pkgdir)
.then(langs => Promise.all( .then(langs => Promise.all(
langs.map(lang=> langs.map(lang=>
util.promisify(fs.readdir)(path.join(pkgdir,lang)) fs.readdir(path.join(pkgdir,lang))
.then(x=>x.map(y=>path.join(pkgdir, lang, y))) .then(x=>x.map(y=>path.join(pkgdir, lang, y)))
))) )))
//eslint-disable-next-line snakecasejs/snakecasejs .then(pkgs=>pkgs.flat().filter(pkg=>fss.exists_sync(path.join(pkg, globals.pkg_installed_file))))
.then(pkgs=>pkgs.flat().filter(pkg=>fs.existsSync(path.join(pkg, globals.pkg_installed_file)))) .then(pkgs=>pkgs.forEach(pkg => new runtime.Runtime(pkg)));
.then(pkgs=>pkgs.forEach(pkg => new runtime.Runtime(pkg)))
logger.info("Starting API Server") logger.info('Starting API Server');
logger.debug("Constructing Express App") logger.debug('Constructing Express App');
logger.debug("Registering middleware") logger.debug('Registering middleware');
app.use(body_parser.urlencoded({extended: true})) app.use(body_parser.urlencoded({extended: true}));
app.use(body_parser.json()) app.use(body_parser.json());
logger.debug("Registering custom message wrappers") logger.debug('Registering custom message wrappers');
express.response.json_error = function(message, code) { express.response.json_error = function(message, code) {
this.status(code) this.status(code);
return this.json({success: false, message, code}) return this.json({success: false, message, code});
} };
express.response.json_success = function(obj) { express.response.json_success = function(obj) {
return this.json({success: true, data: obj}) return this.json({success: true, data: obj});
} };
logger.debug("Registering Routes") logger.debug('Registering Routes');
const ppman_routes = require("./ppman/routes") const ppman_routes = require('./ppman/routes');
app.get ("/repos", ppman_routes.repo_list) app.get ('/repos', ppman_routes.repo_list);
app.post ("/repos", ppman_routes.repo_add) app.post ('/repos', ppman_routes.repo_add);
app.get ("/repos/:repo_slug", ppman_routes.repo_info) app.get ('/repos/:repo_slug', ppman_routes.repo_info);
app.get ("/repos/:repo_slug/packages", ppman_routes.repo_packages) app.get ('/repos/:repo_slug/packages', ppman_routes.repo_packages);
app.get ("/repos/:repo_slug/packages/:language/:version", ppman_routes.package_info) app.get ('/repos/:repo_slug/packages/:language/:version', ppman_routes.package_info);
app.post ("/repos/:repo_slug/packages/:language/:version", ppman_routes.package_install) app.post ('/repos/:repo_slug/packages/:language/:version', ppman_routes.package_install);
app.delete("/repos/:repo_slug/packages/:language/:version", ppman_routes.package_uninstall) //TODO app.delete('/repos/:repo_slug/packages/:language/:version', ppman_routes.package_uninstall); //TODO
const executor_routes = require('./executor/routes') const executor_routes = require('./executor/routes');
app.post ("/jobs", executor_routes.run_job) app.post ('/jobs', executor_routes.run_job);
logger.debug("Calling app.listen") logger.debug('Calling app.listen');
const [address,port] = config.bind_address.split(":") const [address,port] = config.bind_address.split(':');
app.listen(port, address, ()=>{ app.listen(port, address, ()=>{
logger.info("API server started on", config.bind_address) logger.info('API server started on', config.bind_address);
}) });
logger.debug("Setting up flush timers") logger.debug('Setting up flush timers');
setInterval(cache.flush,config.cache_flush_time,path.join(config.data_directory,globals.data_directories.cache)) setInterval(cache.flush,config.cache_flush_time,path.join(config.data_directory,globals.data_directories.cache));
setInterval(state.save,config.state_flush_time,path.join(config.data_directory,globals.data_files.state)) setInterval(state.save,config.state_flush_time,path.join(config.data_directory,globals.data_files.state));
})() })();
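
Once the server is listening on `bind_address` (default `0.0.0.0:6969` per config.js above), the registered routes can be exercised directly. A hypothetical smoke test of the repo list; note that `json_success` wraps payloads in a `data` key:

```js
const fetch = require('node-fetch');

// Hypothetical smoke test against a local instance.
fetch('http://127.0.0.1:6969/repos')
    .then(res => res.json())
    .then(body => console.log(body.data.repos));
```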

ppman/package.js

@@ -1,170 +1,172 @@
const logger = require("logplease").create("ppman/package") const logger = require('logplease').create('ppman/package');
const semver = require("semver") const semver = require('semver');
const config = require("../config") const config = require('../config');
const globals = require("../globals") const globals = require('../globals');
const helpers = require("../helpers") const helpers = require('../helpers');
const path = require("path") const path = require('path');
const fs = require("fs") const fs = require('fs/promises');
const util = require("util") const fss = require('fs');
const cp = require("child_process") const cp = require('child_process');
const crypto = require("crypto") const crypto = require('crypto');
const runtime = require("../runtime") const runtime = require('../runtime');
class Package { class Package {
constructor(repo, {author, language, version, checksums, dependencies, size, buildfile, download, signature}){ constructor(repo, {author, language, version, checksums, dependencies, size, buildfile, download, signature}){
this.author = author this.author = author;
this.language = language this.language = language;
this.version = semver.parse(version) this.version = semver.parse(version);
this.checksums = checksums this.checksums = checksums;
this.dependencies = dependencies this.dependencies = dependencies;
this.size = size this.size = size;
this.buildfile = buildfile this.buildfile = buildfile;
this.download = download this.download = download;
this.signature = signature this.signature = signature;
this.repo = repo this.repo = repo;
} }
get installed(){ get installed(){
return fs.exists_sync(path.join(this.install_path, globals.pkg_installed_file)) return fss.exists_sync(path.join(this.install_path, globals.pkg_installed_file));
} }
get download_url(){ get download_url(){
return helpers.add_url_base_if_required(this.download, this.repo.base_u_r_l) return helpers.add_url_base_if_required(this.download, this.repo.base_u_r_l);
} }
get install_path(){ get install_path(){
return path.join(config.data_directory, return path.join(config.data_directory,
globals.data_directories.packages, globals.data_directories.packages,
this.language, this.language,
this.version.raw) this.version.raw);
} }
async install(){ async install(){
if(this.installed) throw new Error("Already installed") if(this.installed) throw new Error('Already installed');
logger.info(`Installing ${this.language}-${this.version.raw}`) logger.info(`Installing ${this.language}-${this.version.raw}`);
if(fs.exists_sync(this.install_path)){ if(fss.exists_sync(this.install_path)){
logger.warn(`${this.language}-${this.version.raw} has residual files. Removing them.`) logger.warn(`${this.language}-${this.version.raw} has residual files. Removing them.`);
await util.promisify(fs.rm)(this.install_path, {recursive: true, force: true}) await fs.rm(this.install_path, {recursive: true, force: true});
} }
logger.debug(`Making directory ${this.install_path}`) logger.debug(`Making directory ${this.install_path}`);
await util.promisify(fs.mkdir)(this.install_path, {recursive: true}) await fs.mkdir(this.install_path, {recursive: true});
logger.debug(`Downloading package from ${this.download_url} in to ${this.install_path}`) logger.debug(`Downloading package from ${this.download_url} in to ${this.install_path}`);
const pkgfile = helpers.url_basename(this.download_url) const pkgfile = helpers.url_basename(this.download_url);
const pkgpath = path.join(this.install_path, pkgfile) const pkgpath = path.join(this.install_path, pkgfile);
await helpers.buffer_from_u_r_l(this.download_url) await helpers.buffer_from_url(this.download_url)
.then(buf=> util.promisify(fs.write_file)(pkgpath, buf)) .then(buf=> fs.write_file(pkgpath, buf));
logger.debug("Validating checksums") logger.debug('Validating checksums');
Object.keys(this.checksums).forEach(algo => { Object.keys(this.checksums).forEach(algo => {
var val = this.checksums[algo] var val = this.checksums[algo];
logger.debug(`Assert ${algo}(${pkgpath}) == ${val}`) logger.debug(`Assert ${algo}(${pkgpath}) == ${val}`);
var cs = crypto.create_hash(algo) var cs = crypto.create_hash(algo)
.update(fs.read_file_sync(pkgpath)) .update(fss.read_file_sync(pkgpath))
.digest("hex") .digest('hex');
if(cs != val) throw new Error(`Checksum miss-match want: ${val} got: ${cs}`) if(cs != val) throw new Error(`Checksum miss-match want: ${val} got: ${cs}`);
}) });
await this.repo.importKeys() await this.repo.import_keys();
logger.debug("Validating signatutes") logger.debug('Validating signatutes');
if(this.signature != '')
await new Promise((resolve,reject)=>{ await new Promise((resolve,reject)=>{
const gpgspawn = cp.spawn("gpg", ["--verify", "-", pkgpath], { const gpgspawn = cp.spawn('gpg', ['--verify', '-', pkgpath], {
stdio: ["pipe", "ignore", "ignore"] stdio: ['pipe', 'ignore', 'ignore']
}) });
gpgspawn.once("exit", (code, _) => { gpgspawn.once('exit', (code, _) => {
if(code == 0) resolve() if(code == 0) resolve();
else reject(new Error("Invalid signature")) else reject(new Error('Invalid signature'));
}) });
gpgspawn.once("error", reject) gpgspawn.once('error', reject);
gpgspawn.stdin.write(this.signature) gpgspawn.stdin.write(this.signature);
gpgspawn.stdin.end() gpgspawn.stdin.end();
}) });
else
logger.warn('Package does not contain a signature - allowing install, but proceed with caution');
logger.debug(`Extracting package files from archive ${pkgfile} in to ${this.install_path}`);
logger.debug(`Extracting package files from archive ${pkgfile} in to ${this.install_path}`)
await new Promise((resolve, reject)=>{ await new Promise((resolve, reject)=>{
const proc = cp.exec(`bash -c 'cd "${this.install_path}" && tar xzf ${pkgfile}'`) const proc = cp.exec(`bash -c 'cd "${this.install_path}" && tar xzf ${pkgfile}'`);
proc.once("exit", (code,_)=>{ proc.once('exit', (code,_)=>{
if(code == 0) resolve() if(code == 0) resolve();
else reject(new Error("Failed to extract package")) else reject(new Error('Failed to extract package'));
}) });
proc.stdout.pipe(process.stdout) proc.stdout.pipe(process.stdout);
proc.stderr.pipe(process.stderr) proc.stderr.pipe(process.stderr);
proc.once("error", reject) proc.once('error', reject);
}) });
logger.debug("Ensuring binary files exist for package") logger.debug('Ensuring binary files exist for package');
const pkgbin = path.join(this.install_path, `${this.language}-${this.version.raw}`) const pkgbin = path.join(this.install_path, `${this.language}-${this.version.raw}`);
try{ try{
const pkgbinstat = await util.promisify(fs.stat)(pkgbin) const pkgbinstat = await fs.stat(pkgbin);
//eslint-disable-next-line snakecasejs/snakecasejs //eslint-disable-next-line snakecasejs/snakecasejs
if(!pkgbinstat.isDirectory()) throw new Error() if(!pkgbinstat.isDirectory()) throw new Error();
}catch(err){ }catch(err){
throw new Error(`Invalid package: could not find ${this.language}-${this.version.raw}/ contained within package files`) throw new Error(`Invalid package: could not find ${this.language}-${this.version.raw}/ contained within package files`);
} }
logger.debug("Symlinking into runtimes") logger.debug('Symlinking into runtimes');
await util.promisify(fs.symlink)( await fs.symlink(
pkgbin, pkgbin,
path.join(config.data_directory, path.join(config.data_directory,
globals.data_directories.runtimes, globals.data_directories.runtimes,
`${this.language}-${this.version.raw}`) `${this.language}-${this.version.raw}`)
).catch((err)=>err) //catch ).catch((err)=>err); //catch
logger.debug("Registering runtime") logger.debug('Registering runtime');
const pkgruntime = new runtime.Runtime(this.install_path) const pkgruntime = new runtime.Runtime(this.install_path);
logger.debug("Caching environment") logger.debug('Caching environment');
const required_pkgs = [pkgruntime, ...pkgruntime.get_all_dependencies()] const required_pkgs = [pkgruntime, ...pkgruntime.get_all_dependencies()];
const get_env_command = [...required_pkgs.map(p=>`cd "${p.runtime_dir}"; source environment; `), const get_env_command = [...required_pkgs.map(p=>`cd "${p.runtime_dir}"; source environment; `),
"env" ].join(" ") 'env' ].join(' ');
const envout = await new Promise((resolve, reject)=>{ const envout = await new Promise((resolve, reject)=>{
var stdout = "" var stdout = '';
const proc = cp.spawn("env",["-i","bash","-c",`${get_env_command}`], { const proc = cp.spawn('env',['-i','bash','-c',`${get_env_command}`], {
stdio: ["ignore", "pipe", "pipe"]}) stdio: ['ignore', 'pipe', 'pipe']});
proc.once("exit", (code,_)=>{ proc.once('exit', (code,_)=>{
if(code == 0) resolve(stdout) if(code == 0) resolve(stdout);
else reject(new Error("Failed to cache environment")) else reject(new Error('Failed to cache environment'));
}) });
proc.stdout.on("data", (data)=>{ proc.stdout.on('data', (data)=>{
stdout += data stdout += data;
}) });
proc.once("error", reject) proc.once('error', reject);
}) });
const filtered_env = envout.split("\n") const filtered_env = envout.split('\n')
.filter(l=>!["PWD","OLDPWD","_", "SHLVL"].includes(l.split("=",2)[0])) .filter(l=>!['PWD','OLDPWD','_', 'SHLVL'].includes(l.split('=',2)[0]))
.join("\n") .join('\n');
await util.promisify(fs.write_file)(path.join(this.install_path, ".env"), filtered_env) await fs.write_file(path.join(this.install_path, '.env'), filtered_env);
logger.debug("Writing installed state to disk") logger.debug('Writing installed state to disk');
await util.promisify(fs.write_file)(path.join(this.install_path, globals.pkg_installed_file), Date.now().toString()) await fs.write_file(path.join(this.install_path, globals.pkg_installed_file), Date.now().toString());
logger.info(`Installed ${this.language}-${this.version.raw}`) logger.info(`Installed ${this.language}-${this.version.raw}`);
return { return {
language: this.language, language: this.language,
version: this.version.raw version: this.version.raw
} };
} }
} }
module.exports = {Package} module.exports = {Package};
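
The checksum validation above boils down to one hash comparison per algorithm; a standalone sketch of the same check, written in plain Node spelling (the API itself uses nocamel's snake_case aliases):

```js
const crypto = require('crypto');
const fss = require('fs');

// Standalone version of the per-algorithm check install() performs above.
function assert_checksum(pkgpath, algo, expected) {
    const actual = crypto.createHash(algo)
        .update(fss.readFileSync(pkgpath))
        .digest('hex');
    if (actual !== expected)
        throw new Error(`Checksum mismatch want: ${expected} got: ${actual}`);
}
```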

ppman/repo.js

@@ -1,66 +1,66 @@
const logger = require("logplease").create("ppman/repo") const logger = require('logplease').create('ppman/repo');
const cache = require("../cache") const cache = require('../cache');
const CACHE_CONTEXT = "repo" const CACHE_CONTEXT = 'repo';
const cp = require("child_process") const cp = require('child_process');
const yaml = require("js-yaml") const yaml = require('js-yaml');
const { Package } = require("./package") const { Package } = require('./package');
const helpers = require("../helpers") const helpers = require('../helpers');
class Repository { class Repository {
constructor(slug, url){ constructor(slug, url){
this.slug = slug this.slug = slug;
this.url = new URL(url) this.url = new URL(url);
this.keys = [] this.keys = [];
this.packages = [] this.packages = [];
this.base_u_r_l="" this.base_u_r_l='';
logger.debug(`Created repo slug=${this.slug} url=${this.url}`) logger.debug(`Created repo slug=${this.slug} url=${this.url}`);
} }
get cache_key(){ get cache_key(){
return cache.cache_key(CACHE_CONTEXT, this.slug) return cache.cache_key(CACHE_CONTEXT, this.slug);
} }
async load(){ async load(){
try{ try{
var index = await cache.get(this.cache_key,async ()=>{ var index = await cache.get(this.cache_key,async ()=>{
return helpers.buffer_from_u_r_l(this.url) return helpers.buffer_from_url(this.url);
}) });
var repo = yaml.load(index) var repo = yaml.load(index);
if(repo.schema != "ppman-repo-1"){ if(repo.schema != 'ppman-repo-1'){
throw new Error("YAML Schema unknown") throw new Error('YAML Schema unknown');
} }
this.keys = repo.keys this.keys = repo.keys;
this.packages = repo.packages.map(pkg => new Package(this, pkg)) this.packages = repo.packages.map(pkg => new Package(this, pkg));
this.base_u_r_l = repo.baseurl this.base_u_r_l = repo.baseurl;
}catch(err){ }catch(err){
logger.error(`Failed to load repository ${this.slug}:`,err.message) logger.error(`Failed to load repository ${this.slug}:`,err.message);
} }
} }
async importKeys(){ async import_keys(){
await this.load(); await this.load();
logger.info(`Importing keys for repo ${this.slug}`) logger.info(`Importing keys for repo ${this.slug}`);
await new Promise((resolve,reject)=>{ await new Promise((resolve,reject)=>{
const gpgspawn = cp.spawn("gpg", ['--receive-keys', this.keys], { const gpgspawn = cp.spawn('gpg', ['--receive-keys', this.keys], {
stdio: ["ignore", "ignore", "ignore"] stdio: ['ignore', 'ignore', 'ignore']
}) });
gpgspawn.once("exit", (code, _) => { gpgspawn.once('exit', (code, _) => {
if(code == 0) resolve() if(code == 0) resolve();
else reject(new Error("Failed to import keys")) else reject(new Error('Failed to import keys'));
}) });
gpgspawn.once("error", reject) gpgspawn.once('error', reject);
}) });
} }
} }
module.exports = {Repository} module.exports = {Repository};

ppman/routes.js

@@ -1,82 +1,82 @@
const repos = new Map() const repos = new Map();
const state = require("../state") const state = require('../state');
const logger = require("logplease").create("ppman/routes") const logger = require('logplease').create('ppman/routes');
const {Repository} = require("./repo") const {Repository} = require('./repo');
const semver = require("semver") const semver = require('semver');
async function get_or_construct_repo(slug){ async function get_or_construct_repo(slug){
if(repos.has(slug))return repos.get(slug) if(repos.has(slug))return repos.get(slug);
if(state.state.get("repositories").has(slug)){ if(state.state.get('repositories').has(slug)){
const repo_url = state.state.get("repositories").get(slug) const repo_url = state.state.get('repositories').get(slug);
const repo = new Repository(slug, repo_url) const repo = new Repository(slug, repo_url);
await repo.load() await repo.load();
repos.set(slug, repo) repos.set(slug, repo);
return repo return repo;
} }
logger.warn(`Requested repo ${slug} does not exist`) logger.warn(`Requested repo ${slug} does not exist`);
return null return null;
} }
async function get_package(repo, lang, version){
    var candidates = repo.packages.filter(
        pkg => pkg.language == lang && semver.satisfies(pkg.version, version)
    );
    return candidates.sort((a, b) => semver.rcompare(a.version, b.version))[0] || null;
}

module.exports = {
    async repo_list(req, res){
        // GET /repos
        logger.debug('Request for repoList');
        res.json_success({
            repos: (await Promise.all(
                [...state.state.get('repositories').keys()].map(async slug => await get_or_construct_repo(slug))
            )).map(repo => ({
                slug: repo.slug,
                url: repo.url,
                packages: repo.packages.length
            }))
        });
    },

    async repo_add(req, res){
        // POST /repos
        logger.debug(`Request for repoAdd slug=${req.body.slug} url=${req.body.url}`);

        if(!req.body.slug)
            return res.json_error('slug is missing from request body', 400);
        if(!req.body.url)
            return res.json_error('url is missing from request body', 400);

        const repo_state = state.state.get('repositories');

        if(repo_state.has(req.body.slug)) return res.json_error(`repository ${req.body.slug} already exists`, 409);

        repo_state.set(req.body.slug, req.body.url);
        logger.info(`Repository ${req.body.slug} added url=${req.body.url}`);

        return res.json_success(req.body.slug);
    },

    async repo_info(req, res){
        // GET /repos/:slug
        logger.debug(`Request for repoInfo for ${req.params.repo_slug}`);

        const repo = await get_or_construct_repo(req.params.repo_slug);
        if(repo == null) return res.json_error(`Requested repo ${req.params.repo_slug} does not exist`, 404);

        res.json_success({
            slug: repo.slug,
            url: repo.url,
            packages: repo.packages.length
        });
    },

    async repo_packages(req, res){
        // GET /repos/:slug/packages
        logger.debug('Request to repoPackages');

        const repo = await get_or_construct_repo(req.params.repo_slug);
        if(repo == null) return res.json_error(`Requested repo ${req.params.repo_slug} does not exist`, 404);

        res.json_success({
            packages: repo.packages.map(pkg => ({
@@ -84,46 +84,46 @@ module.exports = {
                language_version: pkg.version.raw,
                installed: pkg.installed
            }))
        });
    },

    async package_info(req, res){
        // GET /repos/:slug/packages/:language/:version
        logger.debug('Request to packageInfo');

        const repo = await get_or_construct_repo(req.params.repo_slug);
        if(repo == null) return res.json_error(`Requested repo ${req.params.repo_slug} does not exist`, 404);

        const pkg = await get_package(repo, req.params.language, req.params.version);
        if(pkg == null) return res.json_error(`Requested package ${req.params.language}-${req.params.version} does not exist`, 404);

        res.json_success({
            language: pkg.language,
            language_version: pkg.version.raw,
            author: pkg.author,
            buildfile: pkg.buildfile,
            size: pkg.size,
            dependencies: pkg.dependencies,
            installed: pkg.installed
        });
    },

    async package_install(req, res){
        // POST /repos/:slug/packages/:language/:version
        logger.debug('Request to packageInstall');

        const repo = await get_or_construct_repo(req.params.repo_slug);
        if(repo == null) return res.json_error(`Requested repo ${req.params.repo_slug} does not exist`, 404);

        const pkg = await get_package(repo, req.params.language, req.params.version);
        if(pkg == null) return res.json_error(`Requested package ${req.params.language}-${req.params.version} does not exist`, 404);

        try{
            const response = await pkg.install();
            return res.json_success(response);
        }catch(err){
            logger.error(`Error while installing package ${pkg.language}-${pkg.version}:`, err.message);
            res.json_error(err.message, 500);
        }
@@ -131,6 +131,6 @@ module.exports = {
    async package_uninstall(req, res){
        // DELETE /repos/:slug/packages/:language/:version
        res.json(req.body); //TODO
    }
};
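
The route comments above map out the package-manager surface: GET and POST on /repos, GET /repos/:slug, GET /repos/:slug/packages, and GET/POST/DELETE on /repos/:slug/packages/:language/:version. A minimal client sketch follows, assuming the API is reachable on localhost:6969 (the port published in docker-compose.yaml below) and accepts URL-encoded form bodies, as the compose file's curl call suggests it does:

// Hypothetical client for the routes above, using the node-fetch that is
// already in the lockfile. The base URL and package choice are assumptions.
const fetch = require('node-fetch');

const base = 'http://localhost:6969';

async function main(){
    // POST /repos - register a repository by slug + index url
    const added = await fetch(`${base}/repos`, {
        method: 'POST',
        headers: {'Content-Type': 'application/x-www-form-urlencoded'},
        body: 'slug=local&url=file:///repo/index.yaml'
    });
    console.log(await added.json());

    // GET /repos/local/packages - list what the repo offers
    const packages = await fetch(`${base}/repos/local/packages`);
    console.log(await packages.json());

    // POST /repos/local/packages/python/3.9.1 - install one of them
    const install = await fetch(`${base}/repos/local/packages/python/3.9.1`, {method: 'POST'});
    console.log(await install.json());
}

main().catch(console.error);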


@@ -1,85 +1,86 @@
const logger = require('logplease').create('runtime');
const semver = require('semver');
const config = require('./config');
const globals = require('./globals');
const fss = require('fs');
const path = require('path');

const runtimes = [];

class Runtime {
    #env_vars
    #compiled

    constructor(package_dir){
        const {language, version, author, dependencies, build_platform} = JSON.parse(
            fss.read_file_sync(path.join(package_dir, 'pkg-info.json'))
        );

        this.pkgdir = package_dir;
        this.language = language;
        this.version = semver.parse(version);
        this.author = author;
        this.dependencies = dependencies;

        if(build_platform != globals.platform){
            logger.warn(`Package ${language}-${version} was built for platform ${build_platform}, but our platform is ${globals.platform}`);
        }

        logger.debug(`Package ${language}-${version} was loaded`);
        runtimes.push(this);
    }

    get env_file_path(){
        return path.join(this.runtime_dir, 'environment');
    }

    get runtime_dir(){
        return path.join(config.data_directory, globals.data_directories.runtimes, this.toString());
    }

    get_all_dependencies(){
        const res = [];
        Object.keys(this.dependencies).forEach(dep => {
            const selector = this.dependencies[dep];
            const lang = module.exports.get_latest_runtime_matching_language_version(dep, selector);
            res.push(lang);
            res.concat(lang.get_all_dependencies(lang));
        });
        return res;
    }

    get compile(){
        if(this.#compiled === undefined) this.#compiled = fss.exists_sync(path.join(this.pkgdir, 'compile'));
        return this.#compiled;
    }

    get env_vars(){
        if(!this.#env_vars){
            const env_file = path.join(this.pkgdir, '.env');
            const env_content = fss.read_file_sync(env_file).toString();

            this.#env_vars = {};

            env_content
                .trim()
                .split('\n')
                .map(line => line.split('=', 2))
                .forEach(([key, val]) => {
                    this.#env_vars[key.trim()] = val.trim();
                });
        }

        return this.#env_vars;
    }

    toString(){
        return `${this.language}-${this.version.raw}`;
    }
}

module.exports = runtimes;
module.exports.Runtime = Runtime;

module.exports.get_runtimes_matching_language_version = function(lang, ver){
    return runtimes.filter(rt => rt.language == lang && semver.satisfies(rt.version, ver));
};

module.exports.get_latest_runtime_matching_language_version = function(lang, ver){
    return module.exports.get_runtimes_matching_language_version(lang, ver)
        .sort((a, b) => semver.rcompare(a.version, b.version))[0];
};
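
The rewritten env_vars getter above is where this change set's whitespace trimming lands. A standalone sketch of the same parse, with hypothetical file content:

// Mirrors the parsing in Runtime#env_vars above; the sample content,
// including the stray spaces, is made up for illustration.
const env_content = 'PATH=/piston/runtimes/python-3.9.1/bin \nPYTHONHOME = /piston \n';

const env_vars = {};

env_content
    .trim()                          // drop the trailing newline first
    .split('\n')                     // one KEY=VALUE pair per line
    .map(line => line.split('=', 2))
    .forEach(([key, val]) => {
        env_vars[key.trim()] = val.trim();  // trim whitespace off both sides
    });

console.log(env_vars);
// { PATH: '/piston/runtimes/python-3.9.1/bin', PYTHONHOME: '/piston' }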


@@ -1,45 +1,45 @@
const fs = require('fs/promises');
const fss = require('fs');
const logger = require('logplease').create('state');

const state = new Map();

function replacer(key, value) {
    if(value instanceof Map) {
        return {
            data_type: 'Map',
            value: Array.from(value.entries()),
        };
    } else {
        return value;
    }
}

function reviver(key, value) {
    if(typeof value === 'object' && value !== null) {
        if (value.data_type === 'Map') {
            return new Map(value.value);
        }
    }
    return value;
}

module.exports = {
    state,
    async load(data_file){
        if(fss.exists_sync(data_file)){
            logger.info('Loading state from file');
            var content = await fs.read_file(data_file);
            var obj = JSON.parse(content.toString(), reviver);
            [...obj.keys()].forEach(k => state.set(k, obj.get(k)));
        }else{
            logger.info('Creating new statefile');
            state.set('repositories', new Map().set('offical', 'https://repo.pistonee.org/index.yaml'));
        }
    },
    async save(data_file){
        logger.info('Saving state to disk');
        await fs.write_file(data_file, JSON.stringify(state, replacer));
    }
};
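
Since Maps are not JSON-serialisable on their own, replacer tags them and reviver rebuilds them after JSON.parse. A self-contained round-trip, duplicating the two helpers above and seeding the same default repository as load() does:

// Sketch of the state (de)serialisation round trip.
function replacer(key, value) {
    return value instanceof Map
        ? { data_type: 'Map', value: Array.from(value.entries()) }
        : value;
}

function reviver(key, value) {
    if(typeof value === 'object' && value !== null && value.data_type === 'Map')
        return new Map(value.value);
    return value;
}

// The 'offical' spelling matches the statefile default above.
const state = new Map().set('repositories',
    new Map().set('offical', 'https://repo.pistonee.org/index.yaml'));

const restored = JSON.parse(JSON.stringify(state, replacer), reviver);

console.log(restored.get('repositories').get('offical'));
// -> https://repo.pistonee.org/index.yaml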


@@ -9,12 +9,51 @@
  dependencies:
    "@babel/highlight" "^7.10.4"

"@babel/code-frame@^7.0.0", "@babel/code-frame@^7.12.13":
  version "7.12.13"
  resolved "https://registry.yarnpkg.com/@babel/code-frame/-/code-frame-7.12.13.tgz#dcfc826beef65e75c50e21d3837d7d95798dd658"
  integrity sha512-HV1Cm0Q3ZrpCR93tkWOYiuYIgLxZXZFVG2VgK+MBWjUqZTundupbfx2aXarXuw5Ko5aMcjtJgbSs4vUGBS5v6g==
  dependencies:
    "@babel/highlight" "^7.12.13"

"@babel/generator@^7.12.17":
  version "7.12.17"
  resolved "https://registry.yarnpkg.com/@babel/generator/-/generator-7.12.17.tgz#9ef1dd792d778b32284411df63f4f668a9957287"
  integrity sha512-DSA7ruZrY4WI8VxuS1jWSRezFnghEoYEFrZcw9BizQRmOZiUsiHl59+qEARGPqPikwA/GPTyRCi7isuCK/oyqg==
  dependencies:
    "@babel/types" "^7.12.17"
    jsesc "^2.5.1"
    source-map "^0.5.0"

"@babel/helper-function-name@^7.12.13":
  version "7.12.13"
  resolved "https://registry.yarnpkg.com/@babel/helper-function-name/-/helper-function-name-7.12.13.tgz#93ad656db3c3c2232559fd7b2c3dbdcbe0eb377a"
  integrity sha512-TZvmPn0UOqmvi5G4vvw0qZTpVptGkB1GL61R6lKvrSdIxGm5Pky7Q3fpKiIkQCAtRCBUwB0PaThlx9vebCDSwA==
  dependencies:
    "@babel/helper-get-function-arity" "^7.12.13"
    "@babel/template" "^7.12.13"
    "@babel/types" "^7.12.13"

"@babel/helper-get-function-arity@^7.12.13":
  version "7.12.13"
  resolved "https://registry.yarnpkg.com/@babel/helper-get-function-arity/-/helper-get-function-arity-7.12.13.tgz#bc63451d403a3b3082b97e1d8b3fe5bd4091e583"
  integrity sha512-DjEVzQNz5LICkzN0REdpD5prGoidvbdYk1BVgRUOINaWJP2t6avB27X1guXK1kXNrX0WMfsrm1A/ZBthYuIMQg==
  dependencies:
    "@babel/types" "^7.12.13"

"@babel/helper-split-export-declaration@^7.12.13":
  version "7.12.13"
  resolved "https://registry.yarnpkg.com/@babel/helper-split-export-declaration/-/helper-split-export-declaration-7.12.13.tgz#e9430be00baf3e88b0e13e6f9d4eaf2136372b05"
  integrity sha512-tCJDltF83htUtXx5NLcaDqRmknv652ZWCHyoTETf1CXYJdPC7nohZohjUgieXhv0hTJdRf2FjDueFehdNucpzg==
  dependencies:
    "@babel/types" "^7.12.13"

"@babel/helper-validator-identifier@^7.12.11":
  version "7.12.11"
  resolved "https://registry.yarnpkg.com/@babel/helper-validator-identifier/-/helper-validator-identifier-7.12.11.tgz#c9a1f021917dcb5ccf0d4e453e399022981fc9ed"
  integrity sha512-np/lG3uARFybkoHokJUmf1QfEvRVCPbmQeUQpKow5cQ3xWrV9i3rUHodKDJPQfTVX61qKi+UdYk8kik84n7XOw==

"@babel/highlight@^7.10.4", "@babel/highlight@^7.12.13":
  version "7.12.13"
  resolved "https://registry.yarnpkg.com/@babel/highlight/-/highlight-7.12.13.tgz#8ab538393e00370b26271b01fa08f7f27f2e795c"
  integrity sha512-kocDQvIbgMKlWxXe9fof3TQ+gkIPOUSEYhJjqUjvKMez3krV7vbzYCDq39Oj11UAVK7JqPVGQPlgE85dPNlQww==
@@ -23,6 +62,44 @@
    chalk "^2.0.0"
    js-tokens "^4.0.0"

"@babel/parser@^7.12.13", "@babel/parser@^7.12.17", "@babel/parser@^7.7.0":
  version "7.12.17"
  resolved "https://registry.yarnpkg.com/@babel/parser/-/parser-7.12.17.tgz#bc85d2d47db38094e5bb268fc761716e7d693848"
  integrity sha512-r1yKkiUTYMQ8LiEI0UcQx5ETw5dpTLn9wijn9hk6KkTtOK95FndDN10M+8/s6k/Ymlbivw0Av9q4SlgF80PtHg==

"@babel/template@^7.12.13":
  version "7.12.13"
  resolved "https://registry.yarnpkg.com/@babel/template/-/template-7.12.13.tgz#530265be8a2589dbb37523844c5bcb55947fb327"
  integrity sha512-/7xxiGA57xMo/P2GVvdEumr8ONhFOhfgq2ihK3h1e6THqzTAkHbkXgB0xI9yeTfIUoH3+oAeHhqm/I43OTbbjA==
  dependencies:
    "@babel/code-frame" "^7.12.13"
    "@babel/parser" "^7.12.13"
    "@babel/types" "^7.12.13"

"@babel/traverse@^7.7.0":
  version "7.12.17"
  resolved "https://registry.yarnpkg.com/@babel/traverse/-/traverse-7.12.17.tgz#40ec8c7ffb502c4e54c7f95492dc11b88d718619"
  integrity sha512-LGkTqDqdiwC6Q7fWSwQoas/oyiEYw6Hqjve5KOSykXkmFJFqzvGMb9niaUEag3Rlve492Mkye3gLw9FTv94fdQ==
  dependencies:
    "@babel/code-frame" "^7.12.13"
    "@babel/generator" "^7.12.17"
    "@babel/helper-function-name" "^7.12.13"
    "@babel/helper-split-export-declaration" "^7.12.13"
    "@babel/parser" "^7.12.17"
    "@babel/types" "^7.12.17"
    debug "^4.1.0"
    globals "^11.1.0"
    lodash "^4.17.19"

"@babel/types@^7.12.13", "@babel/types@^7.12.17", "@babel/types@^7.7.0":
  version "7.12.17"
  resolved "https://registry.yarnpkg.com/@babel/types/-/types-7.12.17.tgz#9d711eb807e0934c90b8b1ca0eb1f7230d150963"
  integrity sha512-tNMDjcv/4DIcHxErTgwB9q2ZcYyN0sUfgGKUK/mm1FJK7Wz+KstoEekxrl/tBiNDgLK1HGi+sppj1An/1DR4fQ==
  dependencies:
    "@babel/helper-validator-identifier" "^7.12.11"
    lodash "^4.17.19"
    to-fast-properties "^2.0.0"

"@eslint/eslintrc@^0.3.0":
  version "0.3.0"
  resolved "https://registry.yarnpkg.com/@eslint/eslintrc/-/eslintrc-0.3.0.tgz#d736d6963d7003b6514e6324bec9c602ac340318"
@@ -123,6 +200,18 @@ astral-regex@^2.0.0:
  resolved "https://registry.yarnpkg.com/astral-regex/-/astral-regex-2.0.0.tgz#483143c567aeed4785759c0865786dc77d7d2e31"
  integrity sha512-Z7tMw1ytTXt5jqMcOP+OQteU1VuNK9Y02uuJtKQ1Sv69jXQKKg5cibLwGJow8yzZP+eAc18EmLGPal0bp36rvQ==

babel-eslint@^10.1.0:
  version "10.1.0"
  resolved "https://registry.yarnpkg.com/babel-eslint/-/babel-eslint-10.1.0.tgz#6968e568a910b78fb3779cdd8b6ac2f479943232"
  integrity sha512-ifWaTHQ0ce+448CYop8AdrQiBsGrnC+bMgfyKFdi6EsPLTAWG+QfyDeM6OH+FmWnKvEq5NnBMLvlBUPKQZoDSg==
  dependencies:
    "@babel/code-frame" "^7.0.0"
    "@babel/parser" "^7.7.0"
    "@babel/traverse" "^7.7.0"
    "@babel/types" "^7.7.0"
    eslint-visitor-keys "^1.0.0"
    resolve "^1.12.0"

balanced-match@^1.0.0:
  version "1.0.0"
  resolved "https://registry.yarnpkg.com/balanced-match/-/balanced-match-1.0.0.tgz#89b4d199ab2bee49de164ea02b89ce462d71b767"
@@ -255,7 +344,7 @@ debug@2.6.9:
  dependencies:
    ms "2.0.0"

debug@^4.0.1, debug@^4.1.0, debug@^4.1.1:
  version "4.3.1"
  resolved "https://registry.yarnpkg.com/debug/-/debug-4.3.1.tgz#f0d229c505e0c6d8c49ac553d1b13dc183f6b2ee"
  integrity sha512-doEwdvm4PCeK4K3RQN2ZC2BYUBaxwLARCqZmMjtF8a51J2Rb0xpVloFRnCODwqjpwnAoao4pelN8l3RJdv3gRQ==
@@ -341,7 +430,7 @@ eslint-utils@^2.1.0:
  dependencies:
    eslint-visitor-keys "^1.1.0"

eslint-visitor-keys@^1.0.0, eslint-visitor-keys@^1.1.0, eslint-visitor-keys@^1.3.0:
  version "1.3.0"
  resolved "https://registry.yarnpkg.com/eslint-visitor-keys/-/eslint-visitor-keys-1.3.0.tgz#30ebd1ef7c2fdff01c3a4f151044af25fab0523e"
  integrity sha512-6J72N8UNa462wa/KFODt/PJ3IU60SDpC3QXC1Hjc1BXXpfL2C9R5+AU7jhe0F6GREqVMh4Juu+NY7xn+6dipUQ==
@@ -541,6 +630,11 @@ fs.realpath@^1.0.0:
  resolved "https://registry.yarnpkg.com/fs.realpath/-/fs.realpath-1.0.0.tgz#1504ad2523158caa40db4a2787cb01411994ea4f"
  integrity sha1-FQStJSMVjKpA20onh8sBQRmU6k8=

function-bind@^1.1.1:
  version "1.1.1"
  resolved "https://registry.yarnpkg.com/function-bind/-/function-bind-1.1.1.tgz#a56899d3ea3c9bab874bb9773b7c5ede92f4895d"
  integrity sha512-yIovAzMX49sF8Yl58fSCWJ5svSLuaibPxXQJFLmBObTuCr0Mf1KiPopGM9NiFjiYBCbfaa2Fh6breQ6ANVTI0A==

functional-red-black-tree@^1.0.1:
  version "1.0.1"
  resolved "https://registry.yarnpkg.com/functional-red-black-tree/-/functional-red-black-tree-1.0.1.tgz#1b0ab3bd553b2a0d6399d29c0e3ea0b252078327"
@@ -570,6 +664,11 @@ glob@^7.1.3:
    once "^1.3.0"
    path-is-absolute "^1.0.0"

globals@^11.1.0:
  version "11.12.0"
  resolved "https://registry.yarnpkg.com/globals/-/globals-11.12.0.tgz#ab8795338868a0babd8525758018c2a7eb95c42e"
  integrity sha512-WOBp/EEGUiIsJSp7wcv/y6MO+lV9UoncWqxuFfm8eBwzWNgyfBd6Gz+IeKQ9jCmyhoH99g15M3T+QaVHFjizVA==

globals@^12.1.0:
  version "12.4.0"
  resolved "https://registry.yarnpkg.com/globals/-/globals-12.4.0.tgz#a18813576a41b00a24a97e7f815918c2e19925f8"
@@ -587,6 +686,13 @@ has-flag@^4.0.0:
  resolved "https://registry.yarnpkg.com/has-flag/-/has-flag-4.0.0.tgz#944771fd9c81c81265c4d6941860da06bb59479b"
  integrity sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==

has@^1.0.3:
  version "1.0.3"
  resolved "https://registry.yarnpkg.com/has/-/has-1.0.3.tgz#722d7cbfc1f6aa8241f16dd814e011e1f41e8796"
  integrity sha512-f2dvO0VU6Oej7RkWJGrehjbzMAjFp5/VKPp5tTpWIV4JHHZK1/BxbFRtf/siA2SWTe09caDmVtYYzWEIbBS4zw==
  dependencies:
    function-bind "^1.1.1"

http-errors@1.7.2:
  version "1.7.2"
  resolved "https://registry.yarnpkg.com/http-errors/-/http-errors-1.7.2.tgz#4f5029cf13239f31036e5b2e55292bcfbcc85c8f"
@@ -657,6 +763,13 @@ ipaddr.js@1.9.1:
  resolved "https://registry.yarnpkg.com/ipaddr.js/-/ipaddr.js-1.9.1.tgz#bff38543eeb8984825079ff3a2a8e6cbd46781b3"
  integrity sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==

is-core-module@^2.2.0:
  version "2.2.0"
  resolved "https://registry.yarnpkg.com/is-core-module/-/is-core-module-2.2.0.tgz#97037ef3d52224d85163f5597b2b63d9afed981a"
  integrity sha512-XRAfAdyyY5F5cOXn7hYQDqh2Xmii+DEfIcQGxK/uNwMHhIkPWO0g8msXcbzLe+MpGoR951MlqM/2iIlU4vKDdQ==
  dependencies:
    has "^1.0.3"

is-docker@^2.1.1:
  version "2.1.1"
  resolved "https://registry.yarnpkg.com/is-docker/-/is-docker-2.1.1.tgz#4125a88e44e450d384e09047ede71adc2d144156"
@@ -704,6 +817,11 @@ js-yaml@^4.0.0:
  dependencies:
    argparse "^2.0.1"

jsesc@^2.5.1:
  version "2.5.2"
  resolved "https://registry.yarnpkg.com/jsesc/-/jsesc-2.5.2.tgz#80564d2e483dacf6e8ef209650a67df3f0c283a4"
  integrity sha512-OYu7XEzjkCQ3C5Ps3QIZsQfNpqoJyZZA99wd9aWd05NCtC5pWOkShK2mkL6HXQR6/Cy2lbNdPlZBpuQHXE63gA==

json-schema-traverse@^0.4.1:
  version "0.4.1"
  resolved "https://registry.yarnpkg.com/json-schema-traverse/-/json-schema-traverse-0.4.1.tgz#69f6a87d9513ab8bb8fe63bdb0979c448e684660"
@@ -727,6 +845,11 @@ levn@^0.4.1:
    prelude-ls "^1.2.1"
    type-check "~0.4.0"

lodash@^4.17.19:
  version "4.17.21"
  resolved "https://registry.yarnpkg.com/lodash/-/lodash-4.17.21.tgz#679591c564c3bffaae8454cf0b3df370c3d6911c"
  integrity sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==

lodash@^4.17.20:
  version "4.17.20"
  resolved "https://registry.yarnpkg.com/lodash/-/lodash-4.17.20.tgz#b44a9b6297bcb698f1c51a3545a2b3b368d59c52"
@@ -808,10 +931,9 @@ negotiator@0.6.2:
  resolved "https://registry.yarnpkg.com/negotiator/-/negotiator-0.6.2.tgz#feacf7ccf525a77ae9634436a64883ffeca346fb"
  integrity sha512-hZXc7K2e+PgeI1eDBe/10Ard4ekbfrrqG8Ep+8Jmf4JID2bNg7NvCPOZN+kfF574pFQI7mum2AUqDidoKqcTOw==

nocamel@HexF/nocamel#patch-1:
  version "1.1.0"
  resolved "https://codeload.github.com/HexF/nocamel/tar.gz/89a5bfbbd07c72c302d968b967d0f4fe54846544"

node-fetch@^2.6.1:
  version "2.6.1"
@@ -866,6 +988,11 @@ path-key@^3.1.0:
  resolved "https://registry.yarnpkg.com/path-key/-/path-key-3.1.1.tgz#581f6ade658cbba65a0d3380de7753295054f375"
  integrity sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==

path-parse@^1.0.6:
  version "1.0.6"
  resolved "https://registry.yarnpkg.com/path-parse/-/path-parse-1.0.6.tgz#d62dbb5679405d72c4737ec58600e9ddcf06d24c"
  integrity sha512-GSmOT2EbHrINBf9SR7CDELwlJ8AENk3Qn7OikK4nFYAu3Ote2+JYNVvkpAEQm3/TLNEJFD/xZJjzyxg3KBWOzw==

path-to-regexp@0.1.7:
  version "0.1.7"
  resolved "https://registry.yarnpkg.com/path-to-regexp/-/path-to-regexp-0.1.7.tgz#df604178005f522f15eb4490e7247a1bfaa67f8c"
@@ -934,6 +1061,14 @@ resolve-from@^4.0.0:
  resolved "https://registry.yarnpkg.com/resolve-from/-/resolve-from-4.0.0.tgz#4abcd852ad32dd7baabfe9b40e00a36db5f392e6"
  integrity sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==

resolve@^1.12.0:
  version "1.20.0"
  resolved "https://registry.yarnpkg.com/resolve/-/resolve-1.20.0.tgz#629a013fb3f70755d6f0b7935cc1c2c5378b1975"
  integrity sha512-wENBPt4ySzg4ybFQW2TT1zMQucPK95HSh/nq2CFTZVOGut2+pQvSsgtda4d26YrYcr067wjbmzOG8byDPBX63A==
  dependencies:
    is-core-module "^2.2.0"
    path-parse "^1.0.6"

rimraf@^3.0.2:
  version "3.0.2"
  resolved "https://registry.yarnpkg.com/rimraf/-/rimraf-3.0.2.tgz#f1a5402ba6220ad52cc1282bac1ae3aa49fd061a"
@@ -1013,6 +1148,11 @@ slice-ansi@^4.0.0:
    astral-regex "^2.0.0"
    is-fullwidth-code-point "^3.0.0"

source-map@^0.5.0:
  version "0.5.7"
  resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.5.7.tgz#8a039d2d1021d22d1ea14c80d8ea468ba2ef3fcc"
  integrity sha1-igOdLRAh0i0eoUyA2OpGi6LvP8w=

sprintf-js@~1.0.2:
  version "1.0.3"
  resolved "https://registry.yarnpkg.com/sprintf-js/-/sprintf-js-1.0.3.tgz#04e6926f662895354f3dd015203633b857297e2c"
@@ -1073,6 +1213,11 @@ text-table@^0.2.0:
  resolved "https://registry.yarnpkg.com/text-table/-/text-table-0.2.0.tgz#7f5ee823ae805207c00af2df4a84ec3fcfa570b4"
  integrity sha1-f17oI66AUgfACvLfSoTsP8+lcLQ=

to-fast-properties@^2.0.0:
  version "2.0.0"
  resolved "https://registry.yarnpkg.com/to-fast-properties/-/to-fast-properties-2.0.0.tgz#dc5e698cbd079265bc73e0377681a4e4e83f616e"
  integrity sha1-3F5pjL0HkmW8c+A3doGk5Og/YW4=

toidentifier@1.0.0:
  version "1.0.0"
  resolved "https://registry.yarnpkg.com/toidentifier/-/toidentifier-1.0.0.tgz#7e1be3470f1e77948bc43d94a3c8f4d7752ba553"

docker-compose.yaml Normal file

@@ -0,0 +1,25 @@
version: '3.8'

services:
    piston_api:
        build: api
        restart: always
        ports:
            - 6969:6969
        volumes:
            - ./data/piston:/piston
            - ./repo:/repo
        tmpfs:
            - /piston/cache
            - /piston/jobs

    piston_fs_repo: # Temporary solution until CI works
        build: repo
        command: >
            bash -c '/repo/make.sh &&
            curl http://piston_api:6969/repos -XPOST -d "slug=local&url=file:///repo/index.yaml";
            echo -e "\nAn error here is fine, it just means it was already added. Perhaps you restarted this container"
            '
        volumes:
            - ./repo:/repo
            - ./packages:/packages
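
With this file at the checkout root, a plain `docker-compose up --build` should bring up both services: the API published on port 6969, and the one-shot `piston_fs_repo` builder, whose curl call registers the freshly built local index with the API. The echoed note exists because re-running the registration against an API that already knows the `local` slug returns the 409 from `repo_add` above.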


@@ -0,0 +1,32 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 24.2.3, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Logo" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 1024 1024" style="enable-background:new 0 0 1024 1024;" xml:space="preserve">
<style type="text/css">
.st0{fill:url(#Circle_1_);}
.st1{fill:#00E300;}
.st2{fill:#FFFFFF;}
</style>
<radialGradient id="Circle_1_" cx="512" cy="512" r="512" gradientUnits="userSpaceOnUse">
<stop offset="0" style="stop-color:#0B0B0B"/>
<stop offset="1" style="stop-color:#1B1B1B"/>
</radialGradient>
<path id="Circle" class="st0" d="M512,1024L512,1024c282.77,0,512-229.23,512-512v0C1024,229.23,794.77,0,512,0h0
C229.23,0,0,229.23,0,512v0C0,794.77,229.23,1024,512,1024z"/>
<g id="E">
<path class="st1" d="M218,840.5c-4.31,0-8.4-1.92-11.15-5.23c-2.75-3.32-3.89-7.68-3.1-11.92l117-628
c1.28-6.87,7.27-11.84,14.25-11.84h427c3.58,0,7.03,1.32,9.69,3.72l99,89c4.47,4.01,5.99,10.37,3.84,15.97
c-2.15,5.61-7.53,9.31-13.54,9.31H481.02l-28.52,151H541c3.58,0,7.03,1.32,9.69,3.72l99,89c4.47,4.01,5.99,10.37,3.84,15.97
c-2.15,5.61-7.53,9.31-13.54,9.31H430.03l-28.55,152H771c6.37,0,11.99,4.15,13.86,10.24c1.87,6.08-0.45,12.68-5.71,16.26l-131,89
c-2.4,1.63-5.24,2.51-8.15,2.51H218z"/>
<path d="M762,198l99,89H469l-34,180h106l99,89H418l-34,181h387l-131,89H218l117-628H762 M762,169H335
c-13.97,0-25.95,9.96-28.51,23.69l-117,628c-1.58,8.48,0.69,17.21,6.2,23.84C201.2,851.16,209.38,855,218,855h422
c5.81,0,11.49-1.75,16.3-5.01l131-89c10.53-7.16,15.16-20.34,11.42-32.51C794.98,716.3,783.73,708,771,708H418.95l23.1-123H640
c12.01,0,22.78-7.4,27.08-18.62c4.3-11.21,1.24-23.92-7.69-31.95l-99-89c-5.32-4.79-12.23-7.43-19.39-7.43h-71.01l23.04-122H861
c12.01,0,22.78-7.4,27.08-18.62c4.3-11.21,1.24-23.92-7.69-31.95l-99-89C776.06,171.65,769.16,169,762,169L762,169z"/>
</g>
<g id="Highlight">
<polygon id="Top" class="st2" points="640,556 541,467 435,467 469,287 645.78,287 632.79,198 378,198 430.19,556 "/>
<polygon id="Bottom" class="st2" points="456.57,737 469.55,826 640,826 716.83,773.81 711.45,737 "/>
</g>
</svg>



@@ -26,11 +26,14 @@ pkg-info.jq:
	$(foreach dep, ${LANG_DEPS}, echo '.dependencies.$(word 1,$(subst =, ,${dep}))="$(word 2,$(subst =, ,${dep}))"' >> pkg-info.jq)

%.asc: %
	gpg --detach-sig --armor --batch --output $@ $<

%/: %.tgz
	tar xzf $<

%/: %.tar.gz
	tar xzf $<

.PHONY: clean
clean:
	rm -rf $(filter-out Makefile, $(wildcard *))
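
The new `%/: %.tar.gz` rule mirrors the existing `.tgz` one, so `make <name>/` now unpacks sources shipped under either extension, and the added `--batch` flag keeps gpg from prompting for input when packages are signed non-interactively.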


@@ -8,12 +8,12 @@ run:
	echo 'python$(shell grep -oP "\d+.\d+"<<<${VERSION}) $$*' > run

${NAME}-${VERSION}/environment:
	echo 'export PATH=$$PWD/bin:$$PATH' > $@

${NAME}-${VERSION}/: Python-${VERSION}/
	cd $< && ./configure --prefix /
	$(MAKE) -j$(or ${MAKE_JOBS},64) -C $<
	DESTDIR=../$@ $(MAKE) -j$(or ${MAKE_JOBS},64) -C $< altinstall || true

Python-${VERSION}.tgz:
	curl "https://www.python.org/ftp/python/${VERSION}/$@" -o $@
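
Two fixes land here. The environment file now exports `$$PWD/bin` instead of hard-coding the package directory name, presumably because the file is evaluated from inside the unpacked runtime directory, where the relative form is the correct one. And the tarball target is renamed to `Python-${VERSION}.tgz` so that it matches the `Python-${VERSION}/` prerequisite above it, letting the generic unpack rule actually fire.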


@@ -15,8 +15,9 @@ cleanup: $(patsubst %,%/cleanup,${VERSIONS})

%/${LANGUAGE}-%.pkg.tar.gz: %/Makefile
	$(MAKE) -C $(shell dirname $<)

%/Makefile:
	@mkdir -p $(shell dirname $@)
	@echo 'VERSION=$(patsubst %/Makefile,%,$@)' > $@
	@echo 'NAME=${LANGUAGE}' >> $@
	@echo 'include ../base.mk' >> $@
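
The `>` to `>>` change on the NAME line is the actual fix here: with `>`, writing NAME truncated the generated Makefile and discarded the VERSION assignment written just before it.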

repo/.gitignore vendored Normal file

@@ -0,0 +1,3 @@
*.pkg.tar.gz
index.yaml
*.key

repo/Dockerfile Normal file

@@ -0,0 +1,7 @@
FROM alpine:3.13

RUN apk add --no-cache python3 py3-pip gnupg jq zlib zlib-dev cmake cmake-doc extra-cmake-modules extra-cmake-modules-doc build-base gcc abuild binutils binutils-doc gcc-doc yq bash coreutils util-linux pciutils usbutils coreutils binutils findutils grep && \
    ln -sf /bin/bash /bin/sh && \
    pip3 install 'yq==2.12.0'

CMD [ "bash", "/repo/make.sh" ]

repo/README.MD Normal file

@@ -0,0 +1,7 @@
# Piston Filesystem Repo Builder

This is a simple proof of concept for a repository tool that runs locally.

It only demonstrates building an unsigned python-3.9.1 package; however, if it finds an `.asc` file next to a package, it will include it as the signature.

Mount this whole directory into `/repo` in your API container if you wish to use it.
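
The expected output is one `<language>-<version>.pkg.tar.gz` per built package plus the generated `index.yaml`, both of which the `.gitignore` above excludes; `make.sh` and `mkindex.sh` below show the build and indexing steps.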

repo/make.sh Executable file

@@ -0,0 +1,13 @@
#!/bin/bash -e

cd /repo

# Make packages
pushd ../packages/python
make build VERSIONS=3.9.1
popd

# Make repo index
./mkindex.sh
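
Note that the repo container already runs this script as its CMD (`bash /repo/make.sh`, per the Dockerfile above), so invoking it by hand is only needed when rebuilding packages outside compose.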

repo/mkindex.sh Executable file

@@ -0,0 +1,26 @@
echo "schema: ppman-repo-1" > index.yaml
echo "baseurl: file://$PWD" >> index.yaml
echo "keys: []" >> index.yaml
echo "packages: []" >> index.yaml

yq -yi '.keys[0] = "0x107DA02C7AE97B084746564B9F1FD9D87950DB6F"' index.yaml

i=-1

for pkg in $(find ../packages -type f -name "*.pkg.tar.gz")
do
    ((i=i+1))
    cp $pkg .

    PKGFILE=$(basename $pkg)
    PKGFILENAME=$(echo $PKGFILE | sed 's/\.pkg\.tar\.gz//g')

    PKGNAME=$(echo $PKGFILENAME | grep -oP '^\K.+(?=-)')
    PKGVERSION=$(echo $PKGFILENAME | grep -oP '^.+-\K.+')

    BUILDFILE=https://github.com/engineer-man/piston/tree/v3/packages/python/
    SIZE=$(tar tzvf $PKGFILE | sed 's/ \+/ /g' | cut -f3 -d' ' | sed '2,$s/^/+ /' | paste -sd' ' | bc)

    tar xzf $PKGFILE pkg-info.json
    yq -yi ".packages[$i] = {} | .packages[$i].signature = \"$(cat ${pkg}.asc)\" | .packages[$i].buildfile = \"$BUILDFILE\" | .packages[$i].size = $SIZE | .packages[$i].download = \"$PKGFILE\" | .packages[$i].dependencies = $(jq .dependencies -r pkg-info.json) | .packages[$i].author = $(jq .author pkg-info.json) | .packages[$i].language =\"$PKGNAME\" | .packages[$i].version = \"$PKGVERSION\" | .packages[$i].checksums = {} | .packages[$i].checksums.sha256 = \"$(sha256sum $PKGFILE | awk '{print $1}')\"" index.yaml
    rm pkg-info.json
done
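
For reference, the SIZE pipeline sums the size column of the `tar tzvf` listing to get the total uncompressed size of a package: for two archive entries of 100 and 250 bytes, the `sed`/`paste` stages build the expression `100 + 250`, which `bc` evaluates to 350.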