Compare commits

...

98 commits

Author SHA1 Message Date
scpchicken
c54bd64372 pkg(k-1.0.0): Added k 1.0.0 2025-02-08 20:42:07 -07:00
Omar Brikaa
1d55a41a2d
Explicitly provide env vars instead of inheriting them from parent (#703) 2025-02-08 20:46:46 +02:00
Omar Brikaa
6ef0cdf7b4
Provide HOME in sandbox (#702) 2025-02-08 15:10:50 +02:00
Omar Brikaa
4e361dcf92 Add note to ensure the repository is cloned with LF line endings 2024-10-11 21:44:30 +03:00
Omar Brikaa
512b63d2b5 Document interactive execution 2024-10-04 19:53:33 +03:00
Omar Brikaa
24c5c05308 Give friendlier messages when cgroup v2 is not enabled 2024-10-04 19:31:12 +03:00
Omar Brikaa
47661343da
Downgrade base docker images because some packages were built on the previous image version (#687) 2024-09-17 22:32:23 +03:00
Omar Brikaa
40b8598d2d
Merge pull request #685 from Brikaa/remove-no-socket-update-docs-sigkill-timeout-output-limit-exceeded-status
Remove nosocket, update docs, SIGKILL signal for timeout and output limit, output limit status
2024-09-15 21:40:12 +03:00
Omar Brikaa
c4afd97a38 Use pkgdir inside isolate sandbox to account for packages that have been built with a custom PREFIX
closes #686
2024-09-15 20:48:45 +03:00
Omar Brikaa
ecdced9ee7 Add SIGKILL signal for output limits and timeout, add status for output limits 2024-09-13 16:19:09 +03:00
Omar Brikaa
a99ce9ae47 Remove nosocket, update security principles in docs 2024-09-13 15:14:16 +03:00
Omar Brikaa
bd42fe3357
Improve isolation, execution limits and execution metrics by using Isolate (#683)
* Initial: use Isolate for isolation

* Continue: use Isolate for isolation

* Bug fixes

* timeout is wall-time for backward compatibility

* Documentation, signal names, reported time in ms

* Report memory usage in bytes

* Add privileged flags where needed

* Remove tmpfs

* Remove tmpfs

* Fix package installation

* Fix path, fix Zig: CRLF -> LF
2024-09-08 13:58:40 +12:00
Brian Seymour
59338eee33
Update readme.md (#660) 2024-06-25 11:59:19 +12:00
Damodar Lohani
c4cf018be2
add Dart 3.0.1 package (#602) 2024-04-18 23:40:14 +12:00
Kodie
684b47d2a2
pkg(node-20.11.0) Added Node 20.11.0 (#646) 2024-04-18 23:38:20 +12:00
Ahmed Wael
647bc3a7c7
handle stdout and stderr limits properly (#643)
* handle stdout and stderr limits proberly

Co-authored-by: Omar Brikaa <brikaaomar@gmail.com>

* added environment to docker compose

---------

Co-authored-by: Omar Brikaa <brikaaomar@gmail.com>
2024-01-26 19:41:28 +13:00
Thomas Hobson
b46690de06
Merge pull request #633 from Aetheridon/master
Python 3.12.0 support
2023-11-03 14:28:00 +13:00
Aetheridon
c97324beb3 Python 3.12.0 support 2023-11-01 21:29:36 +00:00
Thomas Hobson
a7fa1b47fe
Merge pull request #632 from ssahai/bugfix/catch_error
Handle process kills gracefully
2023-11-01 14:06:28 +13:00
Thomas Hobson
48102b612f
Merge pull request #623 from devnote-dev/update-crystal
Replace old Crystal version
2023-11-01 14:04:34 +13:00
Thomas Hobson
f785f655d5
Merge pull request #630 from Aetheridon/master
Added files for Python 3.11.0
2023-11-01 14:04:06 +13:00
Shubham Sahai
d8af1ee301 Try-Catch process kills to handle dead processes 2023-10-30 20:09:01 +08:00
Shubham Sahai
dc4bb294b6 bugfix: catch error - "e is not defined" 2023-10-26 02:42:28 +08:00
Aetheridon
18743a3369 Added files for Python 3.11.0 2023-10-19 14:47:45 +01:00
Thomas Hobson
37141e87f6
Merge pull request #624 from Brikaa/fix-job-cleanup-evasion-vulnerability
Fix job cleanup evasion vulnerability, improve job execution error handling
2023-10-09 10:49:47 +13:00
Thomas Hobson
fb658e1921
Merge pull request #627 from Brikaa/improve-containers-stopping-performance
Improve containers stopping performance by handling SIGTERM (95% improvement)
2023-10-09 10:48:48 +13:00
Omar Brikaa
016a8c086f exec comment 2023-10-03 15:21:48 +03:00
Omar Brikaa
fef00b96f1 Improve containers stopping performance by handling SIGTERM 2023-10-03 13:59:23 +03:00
Omar Brikaa
6a47869578 Comments explaining the try-catch flow 2023-09-16 21:37:09 +03:00
Omar Brikaa
040e19fdc2 Interactive execution: run job cleanup regardless of errors 2023-09-15 20:39:15 +03:00
Omar Brikaa
fe2fc374aa Improve normal execution error handling
- Properly differentiate between bad requests and internal server errors
- Avoid clean up evasion by putting the cleanup in the finally block
2023-09-15 20:26:10 +03:00
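
The cleanup-evasion fix described in this commit is the standard try/finally pattern; a sketch with hypothetical job methods (`prime`, `execute`, `cleanup` are illustrative stand-ins, not Piston's real API):

```javascript
// Sketch only: if cleanup ran on the success path alone, a job that threw
// mid-execution would leave its working directory behind. `finally`
// guarantees cleanup on success, bad request, and internal error alike.
async function runJob(job) {
    try {
        await job.prime();          // stage files, create the job directory
        return await job.execute(); // may throw for bad requests or internal errors
    } finally {
        await job.cleanup();        // always runs, so errors cannot evade cleanup
    }
}
```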
devnote-dev
f70ecdd8b4 feat(packages): replace old crystal version 2023-09-03 20:29:22 +01:00
Thomas Hobson
b9adb6f854
Merge pull request #619 from ccall48/master
add some additional py packages
2023-08-08 13:56:31 -07:00
Cory
ce852aa20d
add some additional py packages 2023-08-05 21:53:50 +10:00
Thomas Hobson
89e0dd431d
Merge pull request #613 from pablo-tx/master
Backport engineer-man#519 parallel requests fix
2023-07-16 16:15:58 +12:00
Pablo Pozo
11841b3202 Backport engineer-man#519 parallel requests fix 2023-07-14 13:55:40 +03:00
Thomas Hobson
919076e209
Merge pull request #603 from lorypelli/matl-22.7.4
Updated to `Matl 22.7.4`
2023-06-07 13:07:57 +12:00
Thomas Hobson
e45866535d
Merge pull request #596 from lorypelli/zig-0.10.1
Updated to `Zig 0.10.1`
2023-06-07 13:06:56 +12:00
RVG|lory
57076ee176
Updated to Matl 22.7.4 2023-05-24 18:20:30 +02:00
RVG|lory
ec22c2bbef
Updated to Zig 0.10.1 2023-04-18 14:27:44 +02:00
Thomas Hobson
c1ed7a7118
Merge pull request #574 from LoryPelli/mono-6.12.0
Updated `Mono 6.12.0 build.sh`
2023-04-11 01:14:37 +12:00
Thomas Hobson
8381f40388
Merge pull request #575 from LoryPelli/bash-5.2.0
Updated to `Bash 5.2.0`
2023-04-11 01:13:53 +12:00
Thomas Hobson
8cfdd337f9
Merge pull request #576 from LoryPelli/julia-1.8.5
Updated to `Julia 1.8.5`
2023-04-11 01:13:21 +12:00
Thomas Hobson
32813594d6
Merge pull request #577 from LoryPelli/lua-5.4.4
Updated to `Lua 5.4.4`
2023-04-11 01:12:26 +12:00
Thomas Hobson
b043f07565
Merge pull request #570 from LoryPelli/php-8.2.3
Updated to `PHP 8.2.3`
2023-04-11 01:12:02 +12:00
Thomas Hobson
fa4af90548
Merge pull request #579 from LoryPelli/nim-1.6.2
Updated to `Nim 1.6.2`
2023-04-11 01:11:33 +12:00
Thomas Hobson
89b173c36a
Merge pull request #580 from LoryPelli/octave-8.1.0
Updated to `Octave 8.1.0`
2023-04-11 01:10:53 +12:00
Thomas Hobson
dac6b765ab
Merge pull request #581 from LoryPelli/dart-2.19.6
Updated to `Dart 2.19.6`
2023-04-11 01:10:18 +12:00
Thomas Hobson
9a93c804bf
Merge pull request #585 from LoryPelli/scala-3.2.2
Updated to `Scala 3.2.2`
2023-04-11 01:09:52 +12:00
Thomas Hobson
539aad3873
Merge pull request #586 from LoryPelli/freebasic-1.9.0
Updated to `Freebasic 1.9.0`
2023-04-11 01:09:21 +12:00
Thomas Hobson
52d5c2262a
Merge pull request #587 from LoryPelli/kotlin-1.8.20
Updated to `Kotlin 1.8.20`
2023-04-11 01:08:40 +12:00
Thomas Hobson
388a9bca68
Merge pull request #588 from LoryPelli/pascal-3.2.2
Updated to `Pascal 3.2.2`
2023-04-11 01:08:15 +12:00
Thomas Hobson
54a4b6e449
Merge pull request #589 from LoryPelli/perl-5.36.0
Updated to `Perl 5.36.0`
2023-04-11 01:07:35 +12:00
Thomas Hobson
d4fff15197
Merge branch 'master' into perl-5.36.0 2023-04-11 01:07:27 +12:00
Thomas Hobson
acc449326e
Merge pull request #590 from LoryPelli/vlang-0.3.3
Updated to `Vlang 0.3.3`
2023-04-11 01:06:48 +12:00
RVG|lory
4c45648a79
Updated to Vlang 0.3.3 2023-04-09 12:24:30 +02:00
RVG|lory
1da99d47f1
Updated to Perl 5.36.0 2023-04-08 22:29:44 +02:00
RVG|lory
8a778f7033
Updated to Pascal 3.2.2 2023-04-08 22:07:40 +02:00
RVG|lory
6a48428970
Updated to Freebasic 1.9.0 2023-04-08 19:32:44 +02:00
RVG|lory
334e7820ff
Updated to Scala 3.2.2 2023-04-08 18:21:15 +02:00
RVG|lory
0c253a667d
Updated to Kotlin 1.8.20 2023-04-08 17:35:00 +02:00
RVG|lory
28314f16ab
Updated to Dart 2.19.6 2023-04-08 13:52:48 +02:00
RVG|lory
7ae50eaef7
Updated to Octave 8.1.0 2023-04-08 13:31:16 +02:00
RVG|lory
df553c80ea
Updated to Nim 1.6.2 2023-04-08 13:12:10 +02:00
RVG|lory
b65c5f84df
Updated to Lua 5.4.4 2023-04-08 11:58:48 +02:00
RVG|lory
6524db168f
Updated to Julia 1.8.5 2023-04-08 10:05:11 +02:00
RVG|lory
314a96a8d4
Updated to Bash 5.2.0 2023-04-08 09:52:41 +02:00
RVG|lory
2ae73c5dae
Updated Mono 6.12.0 build.sh 2023-04-08 09:03:57 +02:00
RVG|lory
91d4d402de
Updated to PHP 8.2.3 2023-04-07 19:40:10 +02:00
Thomas Hobson
86d897d580
Merge pull request #568 from LoryPelli/typescript-5.0.3
Updated to `Typescript 5.0.3`
2023-04-07 23:46:45 +12:00
Thomas Hobson
7209d8b12b
Merge branch 'master' into typescript-5.0.3 2023-04-07 23:41:15 +12:00
Thomas Hobson
f5cbf29ef2
Merge pull request #566 from LoryPelli/deno-1.32.3
Updated to `Deno 1.32.3`
2023-04-07 23:39:21 +12:00
Thomas Hobson
9e3745a0b7
Merge pull request #564 from LoryPelli/rust-1.68.2
Updated to `Rust 1.68.2`
2023-04-07 23:38:52 +12:00
RVG|lory
2c7073310e
Changed metadata ts version 2023-04-07 13:32:48 +02:00
RVG|lory
dfd5368fe3
Updated to Typescript 5.0.3 2023-04-07 13:24:15 +02:00
RVG|lory
624db95494
Add files via upload 2023-04-07 10:17:13 +02:00
RVG|lory
7bd4acb346
Add files via upload 2023-04-07 09:50:19 +02:00
Thomas Hobson
2d82118fff
Merge pull request #557 from LoryPelli/master
Updated to `Node 18.15.0 LTS`
2023-04-05 00:28:45 +12:00
RVG|lory
46a9478e6c
Changed newline from CRLF to LF 2023-04-04 14:04:49 +02:00
RVG|lory
ce36d4b0d0
Updated to Node 18.15.0 LTS 2023-04-04 07:07:16 +02:00
Thomas Hobson
3ef36c17a7
Merge pull request #555 from VoltrexKeyva/update-actions
chore(actions): update GitHub workflow actions
2023-04-03 15:26:50 +12:00
Mohammed Keyvanzadeh
4fcb275892
chore(actions): update GitHub workflow actions
Update the actions in the GitHub Actions workflows to their latest
versions.
2023-04-02 11:43:19 +03:30
Thomas Hobson
c92c2d0dcc
Merge pull request #544 from e-sp/fix-piston-bin
bin(piston): Fix ./piston list-pkgs
2023-02-28 14:23:33 +13:00
e-sp
2a5e6a5012 bin(piston): Fix ./piston list-pkgs 2023-02-27 23:54:52 +01:00
Thomas Hobson
b179e0bbbc
Merge pull request #530 from Endercheif/master
pkg(samarium-0.3.1): Added samarium 0.3.1
2023-02-20 14:19:21 +13:00
Thomas Hobson
67c754e1f4
Merge branch 'master' into master 2023-02-20 14:10:31 +13:00
Thomas Hobson
92f69b7f73
Merge pull request #540 from materemias/patch-deno-disable-colors
disable output coloring of deno
2023-02-20 09:32:38 +13:00
Thomas Hobson
ae7630e47e
Merge branch 'master' into master 2023-02-20 09:26:48 +13:00
materemias
32012483a3
disable output coloring of deno 2023-02-17 15:17:00 +01:00
Endercheif
cbc4db7ada
fix: installing samarium with pip 2023-02-05 18:56:59 -08:00
Thomas Hobson
17c4fdb51e
Merge pull request #537 from Vrganj/master
return 200 and piston ver on /, fix empty content-type header validation
2023-01-12 15:21:06 +13:00
Luka Barbić
e86c19b007 return 200 and piston ver on /, fix empty content-type header validation 2023-01-11 18:59:29 +01:00
Thomas Hobson
7441f2633d
Merge pull request #531 from engineer-man/dependabot/npm_and_yarn/api/express-4.17.3
build(deps): bump express from 4.17.1 to 4.17.3 in /api
2023-01-10 17:25:43 +13:00
Endercheif
9de88e2f6c
fix: install samarium from python3.10 2022-12-18 20:47:44 -08:00
Endercheif
e9426d6c03
fix: pip install from github 2022-12-14 17:42:55 -08:00
dependabot[bot]
5ae85c383f
build(deps): bump express from 4.17.1 to 4.17.3 in /api
Bumps [express](https://github.com/expressjs/express) from 4.17.1 to 4.17.3.
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/master/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.17.1...4.17.3)

---
updated-dependencies:
- dependency-name: express
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-12-13 22:59:05 +00:00
Endercheif
081ed1bac1
pip3 -> pip 2022-12-08 00:04:32 -08:00
Endercheif
3f8ef8f524
Add support for samarium (#1)
* pkg(samarium-0.3.1): added samarium

* fix: use pip3

* fix: set version explicitly
2022-12-08 02:54:00 -05:00
171 changed files with 1670 additions and 764 deletions

@@ -13,22 +13,22 @@ jobs:
     name: Build and Push Docker image to Github Packages
     steps:
         - name: Check out repo
-          uses: actions/checkout@v2
+          uses: actions/checkout@v3
         - name: Login to GitHub registry
-          uses: docker/login-action@v1
+          uses: docker/login-action@v2
           with:
              username: ${{ github.actor }}
              password: ${{ secrets.GITHUB_TOKEN }}
              registry: docker.pkg.github.com
         - name: Login to ghcr.io
-          uses: docker/login-action@v1
+          uses: docker/login-action@v2
           with:
              username: ${{ github.actor }}
              password: ${{ secrets.GITHUB_TOKEN }}
              registry: ghcr.io
         - name: Build and push API
-          uses: docker/build-push-action@v2
+          uses: docker/build-push-action@v4
           with:
              context: api
              push: true

@@ -15,7 +15,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
         - name: Checkout
-          uses: actions/checkout@v2
+          uses: actions/checkout@v3
         - name: Get list of changed files
           uses: lots0logs/gh-action-get-changed-files@2.1.4
           with:
@@ -36,10 +36,10 @@ jobs:
     runs-on: ubuntu-latest
     steps:
         - name: Checkout
-          uses: actions/checkout@v2
+          uses: actions/checkout@v3
         - name: Login to GitHub registry
-          uses: docker/login-action@v1
+          uses: docker/login-action@v2
           with:
              username: ${{ github.actor }}
              password: ${{ secrets.GITHUB_TOKEN }}
@@ -60,7 +60,7 @@ jobs:
              ls -la packages
         - name: Upload package as artifact
-          uses: actions/upload-artifact@v2
+          uses: actions/upload-artifact@v3
           with:
              name: packages
              path: packages/*.pkg.tar.gz
@@ -70,9 +70,9 @@ jobs:
     runs-on: ubuntu-latest
     needs: build-pkg
     steps:
-        - uses: actions/checkout@v2
-        - uses: actions/download-artifact@v2
+        - uses: actions/checkout@v3
+        - uses: actions/download-artifact@v3
          with:
              name: packages
@@ -80,7 +80,7 @@ jobs:
          run: mv *.pkg.tar.gz packages/
         - name: Login to GitHub registry
-          uses: docker/login-action@v1
+          uses: docker/login-action@v2
           with:
              username: ${{ github.actor }}
              password: ${{ secrets.GITHUB_TOKEN }}
@@ -92,7 +92,7 @@ jobs:
              docker run -v $(pwd)'/repo:/piston/repo' -v $(pwd)'/packages:/piston/packages' -d --name repo docker.pkg.github.com/engineer-man/piston/repo-builder --no-build
              docker pull docker.pkg.github.com/engineer-man/piston/api
              docker build -t piston-api api
-             docker run --network container:repo -v $(pwd)'/data:/piston' -e PISTON_LOG_LEVEL=DEBUG -e 'PISTON_REPO_URL=http://localhost:8000/index' -d --name api piston-api
+             docker run --privileged --network container:repo -v $(pwd)'/data:/piston' -e PISTON_LOG_LEVEL=DEBUG -e 'PISTON_REPO_URL=http://localhost:8000/index' -d --name api piston-api
              echo Waiting for API to start..
              docker run --network container:api appropriate/curl -s --retry 10 --retry-connrefused http://localhost:2000/api/v2/runtimes

@@ -14,10 +14,10 @@ jobs:
     runs-on: ubuntu-latest
     steps:
         - name: Checkout
-          uses: actions/checkout@v2
+          uses: actions/checkout@v3
         - name: Login to GitHub registry
-          uses: docker/login-action@v1
+          uses: docker/login-action@v2
           with:
              username: ${{ github.actor }}
              password: ${{ secrets.GITHUB_TOKEN }}

@@ -13,16 +13,16 @@ jobs:
     name: Build and Push Docker image to Github Packages
     steps:
         - name: Check out repo
-          uses: actions/checkout@v2
+          uses: actions/checkout@v3
         - name: Login to GitHub registry
-          uses: docker/login-action@v1
+          uses: docker/login-action@v2
           with:
              username: ${{ github.actor }}
              password: ${{ secrets.GITHUB_TOKEN }}
              registry: docker.pkg.github.com
         - name: Build and push repo
-          uses: docker/build-push-action@v2
+          uses: docker/build-push-action@v4
           with:
              context: repo
              pull: true

@@ -1,20 +1,29 @@
+FROM buildpack-deps:buster AS isolate
+RUN apt-get update && \
+    apt-get install -y --no-install-recommends git libcap-dev && \
+    rm -rf /var/lib/apt/lists/* && \
+    git clone https://github.com/envicutor/isolate.git /tmp/isolate/ && \
+    cd /tmp/isolate && \
+    git checkout af6db68042c3aa0ded80787fbb78bc0846ea2114 && \
+    make -j$(nproc) install && \
+    rm -rf /tmp/*
 FROM node:15.10.0-buster-slim
 ENV DEBIAN_FRONTEND=noninteractive
 RUN dpkg-reconfigure -p critical dash
-RUN for i in $(seq 1001 1500); do \
-    groupadd -g $i runner$i && \
-    useradd -M runner$i -g $i -u $i ; \
-    done
 RUN apt-get update && \
     apt-get install -y libxml2 gnupg tar coreutils util-linux libc6-dev \
     binutils build-essential locales libpcre3-dev libevent-dev libgmp3-dev \
     libncurses6 libncurses5 libedit-dev libseccomp-dev rename procps python3 \
     libreadline-dev libblas-dev liblapack-dev libpcre3-dev libarpack2-dev \
     libfftw3-dev libglpk-dev libqhull-dev libqrupdate-dev libsuitesparse-dev \
-    libsundials-dev libpcre2-dev && \
+    libsundials-dev libpcre2-dev libcap-dev && \
     rm -rf /var/lib/apt/lists/*
+RUN useradd -M piston
+COPY --from=isolate /usr/local/bin/isolate /usr/local/bin
+COPY --from=isolate /usr/local/etc/isolate /usr/local/etc/isolate
 RUN sed -i '/en_US.UTF-8/s/^# //g' /etc/locale.gen && locale-gen
@@ -23,7 +32,5 @@ COPY ["package.json", "package-lock.json", "./"]
 RUN npm install
 COPY ./src ./src
-RUN make -C ./src/nosocket/ all && make -C ./src/nosocket/ install
-CMD [ "node", "src"]
+CMD ["/piston_api/src/docker-entrypoint.sh"]
 EXPOSE 2000/tcp

api/package-lock.json (generated file, 395 lines changed)

@@ -11,7 +11,7 @@
     "dependencies": {
         "body-parser": "^1.19.0",
         "chownr": "^2.0.0",
-        "express": "^4.17.1",
+        "express": "^4.17.3",
         "express-ws": "^5.0.2",
         "is-docker": "^2.1.1",
         "logplease": "^1.2.15",
@@ -23,12 +23,12 @@
         }
     },
     "node_modules/accepts": {
-        "version": "1.3.7",
-        "resolved": "https://registry.npmjs.org/accepts/-/accepts-1.3.7.tgz",
-        "integrity": "sha512-Il80Qs2WjYlJIBNzNkK6KYqlVMTbZLXgHx2oT0pU/fjRHyEp+PEfEPY0R3WCwAGVOtauxh1hOxNgIf5bv7dQpA==",
+        "version": "1.3.8",
+        "resolved": "https://registry.npmjs.org/accepts/-/accepts-1.3.8.tgz",
+        "integrity": "sha512-PYAthTa2m2VKxuvSD3DPC/Gy+U+sOA1LAuT8mkmRuvw+NACSaeXEQ+NHcVF7rONl6qcaxV3Uuemwawk+7+SJLw==",
         "dependencies": {
-            "mime-types": "~2.1.24",
-            "negotiator": "0.6.2"
+            "mime-types": "~2.1.34",
+            "negotiator": "0.6.3"
         },
         "engines": {
             "node": ">= 0.6"
@@ -40,29 +40,29 @@
         "integrity": "sha1-ml9pkFGx5wczKPKgCJaLZOopVdI="
     },
     "node_modules/body-parser": {
-        "version": "1.19.0",
-        "resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.19.0.tgz",
-        "integrity": "sha512-dhEPs72UPbDnAQJ9ZKMNTP6ptJaionhP5cBb541nXPlW60Jepo9RV/a4fX4XWW9CuFNK22krhrj1+rgzifNCsw==",
+        "version": "1.19.2",
+        "resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.19.2.tgz",
+        "integrity": "sha512-SAAwOxgoCKMGs9uUAUFHygfLAyaniaoun6I8mFY9pRAJL9+Kec34aU+oIjDhTycub1jozEfEwx1W1IuOYxVSFw==",
         "dependencies": {
-            "bytes": "3.1.0",
+            "bytes": "3.1.2",
             "content-type": "~1.0.4",
             "debug": "2.6.9",
             "depd": "~1.1.2",
-            "http-errors": "1.7.2",
+            "http-errors": "1.8.1",
             "iconv-lite": "0.4.24",
             "on-finished": "~2.3.0",
-            "qs": "6.7.0",
-            "raw-body": "2.4.0",
-            "type-is": "~1.6.17"
+            "qs": "6.9.7",
+            "raw-body": "2.4.3",
+            "type-is": "~1.6.18"
         },
         "engines": {
             "node": ">= 0.8"
         }
     },
     "node_modules/bytes": {
-        "version": "3.1.0",
-        "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.0.tgz",
-        "integrity": "sha512-zauLjrfCG+xvoyaqLoV8bLVXXNGC4JqlxFCutSDWA6fJrTo2ZuvLYTqZ7aHBLZSMOopbzwv8f+wZcVzfVTI2Dg==",
+        "version": "3.1.2",
+        "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
+        "integrity": "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==",
         "engines": {
             "node": ">= 0.8"
         }
@@ -76,11 +76,11 @@
         }
     },
     "node_modules/content-disposition": {
-        "version": "0.5.3",
-        "resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-0.5.3.tgz",
-        "integrity": "sha512-ExO0774ikEObIAEV9kDo50o+79VCUdEB6n6lzKgGwupcVeRlhrj3qGAfwq8G6uBJjkqLrhT0qEYFcWng8z1z0g==",
+        "version": "0.5.4",
+        "resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-0.5.4.tgz",
+        "integrity": "sha512-FveZTNuGw04cxlAiWbzi6zTAL/lhehaWbTtgluJh4/E95DqMwTmha3KZN1aAWA8cFIhHzMZUvLevkw5Rqk+tSQ==",
         "dependencies": {
-            "safe-buffer": "5.1.2"
+            "safe-buffer": "5.2.1"
         },
         "engines": {
             "node": ">= 0.6"
@@ -95,9 +95,9 @@
         }
     },
     "node_modules/cookie": {
-        "version": "0.4.0",
-        "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.4.0.tgz",
-        "integrity": "sha512-+Hp8fLp57wnUSt0tY0tHEXh4voZRDnoIrZPqlo3DPiI4y9lwg/jqx+1Om94/W6ZaPDOUbnjOt/99w66zk+l1Xg==",
+        "version": "0.4.2",
+        "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.4.2.tgz",
+        "integrity": "sha512-aSWTXFzaKWkvHO1Ny/s+ePFpvKsPnjc551iI41v3ny/ow6tBG5Vd+FuqGNhh1LxOmVzOlGUriIlOaokOvhaStA==",
         "engines": {
             "node": ">= 0.6"
         }
@@ -126,7 +126,7 @@
     "node_modules/destroy": {
         "version": "1.0.4",
         "resolved": "https://registry.npmjs.org/destroy/-/destroy-1.0.4.tgz",
-        "integrity": "sha1-l4hXRCxEdJ5CBmE+N5RiBYJqvYA="
+        "integrity": "sha512-3NdhDuEXnfun/z7x9GOElY49LoqVHoGScmOKwmxhsS8N5Y+Z8KyPPDnaSzqWgYt/ji4mqwfTS34Htrk0zPIXVg=="
     },
     "node_modules/ee-first": {
         "version": "1.1.1",
@@ -149,22 +149,22 @@
     "node_modules/etag": {
         "version": "1.8.1",
         "resolved": "https://registry.npmjs.org/etag/-/etag-1.8.1.tgz",
-        "integrity": "sha1-Qa4u62XvpiJorr/qg6x9eSmbCIc=",
+        "integrity": "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==",
         "engines": {
             "node": ">= 0.6"
         }
     },
     "node_modules/express": {
-        "version": "4.17.1",
-        "resolved": "https://registry.npmjs.org/express/-/express-4.17.1.tgz",
-        "integrity": "sha512-mHJ9O79RqluphRrcw2X/GTh3k9tVv8YcoyY4Kkh4WDMUYKRZUq0h1o0w2rrrxBqM7VoeUVqgb27xlEMXTnYt4g==",
+        "version": "4.17.3",
+        "resolved": "https://registry.npmjs.org/express/-/express-4.17.3.tgz",
+        "integrity": "sha512-yuSQpz5I+Ch7gFrPCk4/c+dIBKlQUxtgwqzph132bsT6qhuzss6I8cLJQz7B3rFblzd6wtcI0ZbGltH/C4LjUg==",
         "dependencies": {
-            "accepts": "~1.3.7",
+            "accepts": "~1.3.8",
             "array-flatten": "1.1.1",
-            "body-parser": "1.19.0",
-            "content-disposition": "0.5.3",
+            "body-parser": "1.19.2",
+            "content-disposition": "0.5.4",
             "content-type": "~1.0.4",
-            "cookie": "0.4.0",
+            "cookie": "0.4.2",
             "cookie-signature": "1.0.6",
             "debug": "2.6.9",
             "depd": "~1.1.2",
@@ -178,13 +178,13 @@
             "on-finished": "~2.3.0",
             "parseurl": "~1.3.3",
             "path-to-regexp": "0.1.7",
-            "proxy-addr": "~2.0.5",
-            "qs": "6.7.0",
+            "proxy-addr": "~2.0.7",
+            "qs": "6.9.7",
             "range-parser": "~1.2.1",
-            "safe-buffer": "5.1.2",
-            "send": "0.17.1",
-            "serve-static": "1.14.1",
-            "setprototypeof": "1.1.1",
+            "safe-buffer": "5.2.1",
+            "send": "0.17.2",
+            "serve-static": "1.14.2",
+            "setprototypeof": "1.2.0",
             "statuses": "~1.5.0",
             "type-is": "~1.6.18",
             "utils-merge": "1.0.1",
@@ -226,9 +226,9 @@
         }
     },
     "node_modules/forwarded": {
-        "version": "0.1.2",
-        "resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.1.2.tgz",
-        "integrity": "sha1-mMI9qxF1ZXuMBXPozszZGw/xjIQ=",
+        "version": "0.2.0",
+        "resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
+        "integrity": "sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==",
         "engines": {
             "node": ">= 0.6"
         }
@@ -236,21 +236,21 @@
     "node_modules/fresh": {
         "version": "0.5.2",
         "resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz",
-        "integrity": "sha1-PYyt2Q2XZWn6g1qx+OSyOhBWBac=",
+        "integrity": "sha512-zJ2mQYM18rEFOudeV4GShTGIQ7RbzA7ozbU9I/XBpm7kqgMywgmylMwXHxZJmkVoYkna9d2pVXVXPdYTP9ej8Q==",
         "engines": {
             "node": ">= 0.6"
         }
     },
     "node_modules/http-errors": {
-        "version": "1.7.2",
-        "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.7.2.tgz",
-        "integrity": "sha512-uUQBt3H/cSIVfch6i1EuPNy/YsRSOUBXTVfZ+yR7Zjez3qjBz6i9+i4zjNaoqcoFVI4lQJ5plg63TvGfRSDCRg==",
+        "version": "1.8.1",
+        "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.8.1.tgz",
+        "integrity": "sha512-Kpk9Sm7NmI+RHhnj6OIWDI1d6fIoFAtFt9RLaTMRlg/8w49juAStsrBgp0Dp4OdxdVbRIeKhtCUvoi/RuAhO4g==",
         "dependencies": {
             "depd": "~1.1.2",
-            "inherits": "2.0.3",
-            "setprototypeof": "1.1.1",
+            "inherits": "2.0.4",
+            "setprototypeof": "1.2.0",
             "statuses": ">= 1.5.0 < 2",
-            "toidentifier": "1.0.0"
+            "toidentifier": "1.0.1"
         },
         "engines": {
             "node": ">= 0.6"
@@ -268,9 +268,9 @@
         }
     },
     "node_modules/inherits": {
-        "version": "2.0.3",
-        "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz",
-        "integrity": "sha1-Yzwsg+PaQqUC9SRmAiSA9CCCYd4="
+        "version": "2.0.4",
+        "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz",
+        "integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ=="
     },
     "node_modules/ipaddr.js": {
         "version": "1.9.1",
@@ -340,19 +340,19 @@
         }
     },
     "node_modules/mime-db": {
-        "version": "1.46.0",
-        "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.46.0.tgz",
-        "integrity": "sha512-svXaP8UQRZ5K7or+ZmfNhg2xX3yKDMUzqadsSqi4NCH/KomcH75MAMYAGVlvXn4+b/xOPhS3I2uHKRUzvjY7BQ==",
+        "version": "1.52.0",
+        "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
+        "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
         "engines": {
             "node": ">= 0.6"
         }
     },
     "node_modules/mime-types": {
-        "version": "2.1.29",
-        "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.29.tgz",
-        "integrity": "sha512-Y/jMt/S5sR9OaqteJtslsFZKWOIIqMACsJSiHghlCAyhf7jfVYjKBmLiX8OgpWeW+fjJ2b+Az69aPFPkUOY6xQ==",
+        "version": "2.1.35",
+        "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
+        "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
         "dependencies": {
-            "mime-db": "1.46.0"
+            "mime-db": "1.52.0"
         },
         "engines": {
             "node": ">= 0.6"
@@ -364,9 +364,9 @@
         "integrity": "sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g="
     },
     "node_modules/negotiator": {
-        "version": "0.6.2",
-        "resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.2.tgz",
-        "integrity": "sha512-hZXc7K2e+PgeI1eDBe/10Ard4ekbfrrqG8Ep+8Jmf4JID2bNg7NvCPOZN+kfF574pFQI7mum2AUqDidoKqcTOw==",
+        "version": "0.6.3",
+        "resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.3.tgz",
+        "integrity": "sha512-+EUsqGPLsM+j/zdChZjsnX51g4XrHFOIXwfnCVPGlQk/k5giakcKsuxCObBRu6DSm9opw/O6slWbJdghQM4bBg==",
         "engines": {
             "node": ">= 0.6"
         }
@@ -407,11 +407,11 @@
         "integrity": "sha1-32BBeABfUi8V60SQ5yR6G/qmf4w="
     },
     "node_modules/proxy-addr": {
-        "version": "2.0.6",
-        "resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.6.tgz",
-        "integrity": "sha512-dh/frvCBVmSsDYzw6n926jv974gddhkFPfiN8hPOi30Wax25QZyZEGveluCgliBnqmuM+UJmBErbAUFIoDbjOw==",
+        "version": "2.0.7",
+        "resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.7.tgz",
+        "integrity": "sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==",
         "dependencies": {
-            "forwarded": "~0.1.2",
+            "forwarded": "0.2.0",
             "ipaddr.js": "1.9.1"
         },
         "engines": {
@@ -419,11 +419,14 @@
         }
     },
     "node_modules/qs": {
-        "version": "6.7.0",
-        "resolved": "https://registry.npmjs.org/qs/-/qs-6.7.0.tgz",
-        "integrity": "sha512-VCdBRNFTX1fyE7Nb6FYoURo/SPe62QCaAyzJvUjwRaIsc+NePBEniHlvxFmmX56+HZphIGtV0XeCirBtpDrTyQ==",
+        "version": "6.9.7",
+        "resolved": "https://registry.npmjs.org/qs/-/qs-6.9.7.tgz",
+        "integrity": "sha512-IhMFgUmuNpyRfxA90umL7ByLlgRXu6tIfKPpF5TmcfRLlLCckfP/g3IQmju6jjpu+Hh8rA+2p6A27ZSPOOHdKw==",
         "engines": {
             "node": ">=0.6"
+        },
+        "funding": {
+            "url": "https://github.com/sponsors/ljharb"
         }
     },
     "node_modules/range-parser": {
@@ -435,12 +438,12 @@
         }
     },
     "node_modules/raw-body": {
-        "version": "2.4.0",
-        "resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.4.0.tgz",
-        "integrity": "sha512-4Oz8DUIwdvoa5qMJelxipzi/iJIi40O5cGV1wNYp5hvZP8ZN0T+jiNkL0QepXs+EsQ9XJ8ipEDoiH70ySUJP3Q==",
+        "version": "2.4.3",
+        "resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.4.3.tgz",
+        "integrity": "sha512-UlTNLIcu0uzb4D2f4WltY6cVjLi+/jEN4lgEUj3E04tpMDpUlkBo/eSn6zou9hum2VMNpCCUone0O0WeJim07g==",
         "dependencies": {
-            "bytes": "3.1.0",
-            "http-errors": "1.7.2",
+            "bytes": "3.1.2",
+            "http-errors": "1.8.1",
             "iconv-lite": "0.4.24",
             "unpipe": "1.0.0"
         },
@@ -449,9 +452,23 @@
} }
}, },
"node_modules/safe-buffer": { "node_modules/safe-buffer": {
"version": "5.1.2", "version": "5.2.1",
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz", "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz",
"integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==" "integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
]
}, },
"node_modules/safer-buffer": { "node_modules/safer-buffer": {
"version": "2.1.2", "version": "2.1.2",
@@ -473,9 +490,9 @@
} }
}, },
"node_modules/send": { "node_modules/send": {
"version": "0.17.1", "version": "0.17.2",
"resolved": "https://registry.npmjs.org/send/-/send-0.17.1.tgz", "resolved": "https://registry.npmjs.org/send/-/send-0.17.2.tgz",
"integrity": "sha512-BsVKsiGcQMFwT8UxypobUKyv7irCNRHk1T0G680vk88yf6LBByGcZJOTJCrTP2xVN6yI+XjPJcNuE3V4fT9sAg==", "integrity": "sha512-UJYB6wFSJE3G00nEivR5rgWp8c2xXvJ3OPWPhmuteU0IKj8nKbG3DrjiOmLwpnHGYWAVwA69zmTm++YG0Hmwww==",
"dependencies": { "dependencies": {
"debug": "2.6.9", "debug": "2.6.9",
"depd": "~1.1.2", "depd": "~1.1.2",
@@ -484,9 +501,9 @@
"escape-html": "~1.0.3", "escape-html": "~1.0.3",
"etag": "~1.8.1", "etag": "~1.8.1",
"fresh": "0.5.2", "fresh": "0.5.2",
"http-errors": "~1.7.2", "http-errors": "1.8.1",
"mime": "1.6.0", "mime": "1.6.0",
"ms": "2.1.1", "ms": "2.1.3",
"on-finished": "~2.3.0", "on-finished": "~2.3.0",
"range-parser": "~1.2.1", "range-parser": "~1.2.1",
"statuses": "~1.5.0" "statuses": "~1.5.0"
@@ -496,28 +513,28 @@
} }
}, },
"node_modules/send/node_modules/ms": { "node_modules/send/node_modules/ms": {
"version": "2.1.1", "version": "2.1.3",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.1.tgz", "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
"integrity": "sha512-tgp+dl5cGk28utYktBsrFqA7HKgrhgPsg6Z/EfhWI4gl1Hwq8B/GmY/0oXZ6nF8hDVesS/FpnYaD/kOWhYQvyg==" "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="
}, },
"node_modules/serve-static": { "node_modules/serve-static": {
"version": "1.14.1", "version": "1.14.2",
"resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.14.1.tgz", "resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.14.2.tgz",
"integrity": "sha512-JMrvUwE54emCYWlTI+hGrGv5I8dEwmco/00EvkzIIsR7MqrHonbD9pO2MOfFnpFntl7ecpZs+3mW+XbQZu9QCg==", "integrity": "sha512-+TMNA9AFxUEGuC0z2mevogSnn9MXKb4fa7ngeRMJaaGv8vTwnIEkKi+QGvPt33HSnf8pRS+WGM0EbMtCJLKMBQ==",
"dependencies": { "dependencies": {
"encodeurl": "~1.0.2", "encodeurl": "~1.0.2",
"escape-html": "~1.0.3", "escape-html": "~1.0.3",
"parseurl": "~1.3.3", "parseurl": "~1.3.3",
"send": "0.17.1" "send": "0.17.2"
}, },
"engines": { "engines": {
"node": ">= 0.8.0" "node": ">= 0.8.0"
} }
}, },
"node_modules/setprototypeof": { "node_modules/setprototypeof": {
"version": "1.1.1", "version": "1.2.0",
"resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.1.1.tgz", "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.2.0.tgz",
"integrity": "sha512-JvdAWfbXeIGaZ9cILp38HntZSFSo3mWg6xGcJJsd+d4aRMOqauag1C63dJfDw7OaMYwEbHMOxEZ1lqVRYP2OAw==" "integrity": "sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw=="
}, },
"node_modules/statuses": { "node_modules/statuses": {
"version": "1.5.0", "version": "1.5.0",
@@ -528,9 +545,9 @@
} }
}, },
"node_modules/toidentifier": { "node_modules/toidentifier": {
"version": "1.0.0", "version": "1.0.1",
"resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.0.tgz", "resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.1.tgz",
"integrity": "sha512-yaOH/Pk/VEhBWWTlhI+qXxDFXlejDGcQipMlyxda9nthulaxLZUNcUqFxokp0vcYnvteJln5FNQDRrxj3YcbVw==", "integrity": "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA==",
"engines": { "engines": {
"node": ">=0.6" "node": ">=0.6"
} }
@@ -611,12 +628,12 @@
}, },
"dependencies": { "dependencies": {
"accepts": { "accepts": {
"version": "1.3.7", "version": "1.3.8",
"resolved": "https://registry.npmjs.org/accepts/-/accepts-1.3.7.tgz", "resolved": "https://registry.npmjs.org/accepts/-/accepts-1.3.8.tgz",
"integrity": "sha512-Il80Qs2WjYlJIBNzNkK6KYqlVMTbZLXgHx2oT0pU/fjRHyEp+PEfEPY0R3WCwAGVOtauxh1hOxNgIf5bv7dQpA==", "integrity": "sha512-PYAthTa2m2VKxuvSD3DPC/Gy+U+sOA1LAuT8mkmRuvw+NACSaeXEQ+NHcVF7rONl6qcaxV3Uuemwawk+7+SJLw==",
"requires": { "requires": {
"mime-types": "~2.1.24", "mime-types": "~2.1.34",
"negotiator": "0.6.2" "negotiator": "0.6.3"
} }
}, },
"array-flatten": { "array-flatten": {
@@ -625,26 +642,26 @@
"integrity": "sha1-ml9pkFGx5wczKPKgCJaLZOopVdI=" "integrity": "sha1-ml9pkFGx5wczKPKgCJaLZOopVdI="
}, },
"body-parser": { "body-parser": {
"version": "1.19.0", "version": "1.19.2",
"resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.19.0.tgz", "resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.19.2.tgz",
"integrity": "sha512-dhEPs72UPbDnAQJ9ZKMNTP6ptJaionhP5cBb541nXPlW60Jepo9RV/a4fX4XWW9CuFNK22krhrj1+rgzifNCsw==", "integrity": "sha512-SAAwOxgoCKMGs9uUAUFHygfLAyaniaoun6I8mFY9pRAJL9+Kec34aU+oIjDhTycub1jozEfEwx1W1IuOYxVSFw==",
"requires": { "requires": {
"bytes": "3.1.0", "bytes": "3.1.2",
"content-type": "~1.0.4", "content-type": "~1.0.4",
"debug": "2.6.9", "debug": "2.6.9",
"depd": "~1.1.2", "depd": "~1.1.2",
"http-errors": "1.7.2", "http-errors": "1.8.1",
"iconv-lite": "0.4.24", "iconv-lite": "0.4.24",
"on-finished": "~2.3.0", "on-finished": "~2.3.0",
"qs": "6.7.0", "qs": "6.9.7",
"raw-body": "2.4.0", "raw-body": "2.4.3",
"type-is": "~1.6.17" "type-is": "~1.6.18"
} }
}, },
"bytes": { "bytes": {
"version": "3.1.0", "version": "3.1.2",
"resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.0.tgz", "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
"integrity": "sha512-zauLjrfCG+xvoyaqLoV8bLVXXNGC4JqlxFCutSDWA6fJrTo2ZuvLYTqZ7aHBLZSMOopbzwv8f+wZcVzfVTI2Dg==" "integrity": "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg=="
}, },
"chownr": { "chownr": {
"version": "2.0.0", "version": "2.0.0",
@@ -652,11 +669,11 @@
"integrity": "sha512-bIomtDF5KGpdogkLd9VspvFzk9KfpyyGlS8YFVZl7TGPBHL5snIOnxeshwVgPteQ9b4Eydl+pVbIyE1DcvCWgQ==" "integrity": "sha512-bIomtDF5KGpdogkLd9VspvFzk9KfpyyGlS8YFVZl7TGPBHL5snIOnxeshwVgPteQ9b4Eydl+pVbIyE1DcvCWgQ=="
}, },
"content-disposition": { "content-disposition": {
"version": "0.5.3", "version": "0.5.4",
"resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-0.5.3.tgz", "resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-0.5.4.tgz",
"integrity": "sha512-ExO0774ikEObIAEV9kDo50o+79VCUdEB6n6lzKgGwupcVeRlhrj3qGAfwq8G6uBJjkqLrhT0qEYFcWng8z1z0g==", "integrity": "sha512-FveZTNuGw04cxlAiWbzi6zTAL/lhehaWbTtgluJh4/E95DqMwTmha3KZN1aAWA8cFIhHzMZUvLevkw5Rqk+tSQ==",
"requires": { "requires": {
"safe-buffer": "5.1.2" "safe-buffer": "5.2.1"
} }
}, },
"content-type": { "content-type": {
@@ -665,9 +682,9 @@
"integrity": "sha512-hIP3EEPs8tB9AT1L+NUqtwOAps4mk2Zob89MWXMHjHWg9milF/j4osnnQLXBCBFBk/tvIG/tUc9mOUJiPBhPXA==" "integrity": "sha512-hIP3EEPs8tB9AT1L+NUqtwOAps4mk2Zob89MWXMHjHWg9milF/j4osnnQLXBCBFBk/tvIG/tUc9mOUJiPBhPXA=="
}, },
"cookie": { "cookie": {
"version": "0.4.0", "version": "0.4.2",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.4.0.tgz", "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.4.2.tgz",
"integrity": "sha512-+Hp8fLp57wnUSt0tY0tHEXh4voZRDnoIrZPqlo3DPiI4y9lwg/jqx+1Om94/W6ZaPDOUbnjOt/99w66zk+l1Xg==" "integrity": "sha512-aSWTXFzaKWkvHO1Ny/s+ePFpvKsPnjc551iI41v3ny/ow6tBG5Vd+FuqGNhh1LxOmVzOlGUriIlOaokOvhaStA=="
}, },
"cookie-signature": { "cookie-signature": {
"version": "1.0.6", "version": "1.0.6",
@@ -690,7 +707,7 @@
"destroy": { "destroy": {
"version": "1.0.4", "version": "1.0.4",
"resolved": "https://registry.npmjs.org/destroy/-/destroy-1.0.4.tgz", "resolved": "https://registry.npmjs.org/destroy/-/destroy-1.0.4.tgz",
"integrity": "sha1-l4hXRCxEdJ5CBmE+N5RiBYJqvYA=" "integrity": "sha512-3NdhDuEXnfun/z7x9GOElY49LoqVHoGScmOKwmxhsS8N5Y+Z8KyPPDnaSzqWgYt/ji4mqwfTS34Htrk0zPIXVg=="
}, },
"ee-first": { "ee-first": {
"version": "1.1.1", "version": "1.1.1",
@@ -710,19 +727,19 @@
"etag": { "etag": {
"version": "1.8.1", "version": "1.8.1",
"resolved": "https://registry.npmjs.org/etag/-/etag-1.8.1.tgz", "resolved": "https://registry.npmjs.org/etag/-/etag-1.8.1.tgz",
"integrity": "sha1-Qa4u62XvpiJorr/qg6x9eSmbCIc=" "integrity": "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg=="
}, },
"express": { "express": {
"version": "4.17.1", "version": "4.17.3",
"resolved": "https://registry.npmjs.org/express/-/express-4.17.1.tgz", "resolved": "https://registry.npmjs.org/express/-/express-4.17.3.tgz",
"integrity": "sha512-mHJ9O79RqluphRrcw2X/GTh3k9tVv8YcoyY4Kkh4WDMUYKRZUq0h1o0w2rrrxBqM7VoeUVqgb27xlEMXTnYt4g==", "integrity": "sha512-yuSQpz5I+Ch7gFrPCk4/c+dIBKlQUxtgwqzph132bsT6qhuzss6I8cLJQz7B3rFblzd6wtcI0ZbGltH/C4LjUg==",
"requires": { "requires": {
"accepts": "~1.3.7", "accepts": "~1.3.8",
"array-flatten": "1.1.1", "array-flatten": "1.1.1",
"body-parser": "1.19.0", "body-parser": "1.19.2",
"content-disposition": "0.5.3", "content-disposition": "0.5.4",
"content-type": "~1.0.4", "content-type": "~1.0.4",
"cookie": "0.4.0", "cookie": "0.4.2",
"cookie-signature": "1.0.6", "cookie-signature": "1.0.6",
"debug": "2.6.9", "debug": "2.6.9",
"depd": "~1.1.2", "depd": "~1.1.2",
@@ -736,13 +753,13 @@
"on-finished": "~2.3.0", "on-finished": "~2.3.0",
"parseurl": "~1.3.3", "parseurl": "~1.3.3",
"path-to-regexp": "0.1.7", "path-to-regexp": "0.1.7",
"proxy-addr": "~2.0.5", "proxy-addr": "~2.0.7",
"qs": "6.7.0", "qs": "6.9.7",
"range-parser": "~1.2.1", "range-parser": "~1.2.1",
"safe-buffer": "5.1.2", "safe-buffer": "5.2.1",
"send": "0.17.1", "send": "0.17.2",
"serve-static": "1.14.1", "serve-static": "1.14.2",
"setprototypeof": "1.1.1", "setprototypeof": "1.2.0",
"statuses": "~1.5.0", "statuses": "~1.5.0",
"type-is": "~1.6.18", "type-is": "~1.6.18",
"utils-merge": "1.0.1", "utils-merge": "1.0.1",
@@ -772,25 +789,25 @@
} }
}, },
"forwarded": { "forwarded": {
"version": "0.1.2", "version": "0.2.0",
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.1.2.tgz", "resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
"integrity": "sha1-mMI9qxF1ZXuMBXPozszZGw/xjIQ=" "integrity": "sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow=="
}, },
"fresh": { "fresh": {
"version": "0.5.2", "version": "0.5.2",
"resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz", "resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz",
"integrity": "sha1-PYyt2Q2XZWn6g1qx+OSyOhBWBac=" "integrity": "sha512-zJ2mQYM18rEFOudeV4GShTGIQ7RbzA7ozbU9I/XBpm7kqgMywgmylMwXHxZJmkVoYkna9d2pVXVXPdYTP9ej8Q=="
}, },
"http-errors": { "http-errors": {
"version": "1.7.2", "version": "1.8.1",
"resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.7.2.tgz", "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.8.1.tgz",
"integrity": "sha512-uUQBt3H/cSIVfch6i1EuPNy/YsRSOUBXTVfZ+yR7Zjez3qjBz6i9+i4zjNaoqcoFVI4lQJ5plg63TvGfRSDCRg==", "integrity": "sha512-Kpk9Sm7NmI+RHhnj6OIWDI1d6fIoFAtFt9RLaTMRlg/8w49juAStsrBgp0Dp4OdxdVbRIeKhtCUvoi/RuAhO4g==",
"requires": { "requires": {
"depd": "~1.1.2", "depd": "~1.1.2",
"inherits": "2.0.3", "inherits": "2.0.4",
"setprototypeof": "1.1.1", "setprototypeof": "1.2.0",
"statuses": ">= 1.5.0 < 2", "statuses": ">= 1.5.0 < 2",
"toidentifier": "1.0.0" "toidentifier": "1.0.1"
} }
}, },
"iconv-lite": { "iconv-lite": {
@@ -802,9 +819,9 @@
} }
}, },
"inherits": { "inherits": {
"version": "2.0.3", "version": "2.0.4",
"resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz", "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz",
"integrity": "sha1-Yzwsg+PaQqUC9SRmAiSA9CCCYd4=" "integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ=="
}, },
"ipaddr.js": { "ipaddr.js": {
"version": "1.9.1", "version": "1.9.1",
@@ -850,16 +867,16 @@
"integrity": "sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==" "integrity": "sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg=="
}, },
"mime-db": { "mime-db": {
"version": "1.46.0", "version": "1.52.0",
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.46.0.tgz", "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
"integrity": "sha512-svXaP8UQRZ5K7or+ZmfNhg2xX3yKDMUzqadsSqi4NCH/KomcH75MAMYAGVlvXn4+b/xOPhS3I2uHKRUzvjY7BQ==" "integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg=="
}, },
"mime-types": { "mime-types": {
"version": "2.1.29", "version": "2.1.35",
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.29.tgz", "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
"integrity": "sha512-Y/jMt/S5sR9OaqteJtslsFZKWOIIqMACsJSiHghlCAyhf7jfVYjKBmLiX8OgpWeW+fjJ2b+Az69aPFPkUOY6xQ==", "integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
"requires": { "requires": {
"mime-db": "1.46.0" "mime-db": "1.52.0"
} }
}, },
"ms": { "ms": {
@@ -868,9 +885,9 @@
"integrity": "sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g=" "integrity": "sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g="
}, },
"negotiator": { "negotiator": {
"version": "0.6.2", "version": "0.6.3",
"resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.2.tgz", "resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.3.tgz",
"integrity": "sha512-hZXc7K2e+PgeI1eDBe/10Ard4ekbfrrqG8Ep+8Jmf4JID2bNg7NvCPOZN+kfF574pFQI7mum2AUqDidoKqcTOw==" "integrity": "sha512-+EUsqGPLsM+j/zdChZjsnX51g4XrHFOIXwfnCVPGlQk/k5giakcKsuxCObBRu6DSm9opw/O6slWbJdghQM4bBg=="
}, },
"nocamel": { "nocamel": {
"version": "git+ssh://git@github.com/HexF/nocamel.git#89a5bfbbd07c72c302d968b967d0f4fe54846544", "version": "git+ssh://git@github.com/HexF/nocamel.git#89a5bfbbd07c72c302d968b967d0f4fe54846544",
@@ -900,18 +917,18 @@
"integrity": "sha1-32BBeABfUi8V60SQ5yR6G/qmf4w=" "integrity": "sha1-32BBeABfUi8V60SQ5yR6G/qmf4w="
}, },
"proxy-addr": { "proxy-addr": {
"version": "2.0.6", "version": "2.0.7",
"resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.6.tgz", "resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.7.tgz",
"integrity": "sha512-dh/frvCBVmSsDYzw6n926jv974gddhkFPfiN8hPOi30Wax25QZyZEGveluCgliBnqmuM+UJmBErbAUFIoDbjOw==", "integrity": "sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==",
"requires": { "requires": {
"forwarded": "~0.1.2", "forwarded": "0.2.0",
"ipaddr.js": "1.9.1" "ipaddr.js": "1.9.1"
} }
}, },
"qs": { "qs": {
"version": "6.7.0", "version": "6.9.7",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.7.0.tgz", "resolved": "https://registry.npmjs.org/qs/-/qs-6.9.7.tgz",
"integrity": "sha512-VCdBRNFTX1fyE7Nb6FYoURo/SPe62QCaAyzJvUjwRaIsc+NePBEniHlvxFmmX56+HZphIGtV0XeCirBtpDrTyQ==" "integrity": "sha512-IhMFgUmuNpyRfxA90umL7ByLlgRXu6tIfKPpF5TmcfRLlLCckfP/g3IQmju6jjpu+Hh8rA+2p6A27ZSPOOHdKw=="
}, },
"range-parser": { "range-parser": {
"version": "1.2.1", "version": "1.2.1",
@@ -919,20 +936,20 @@
"integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==" "integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg=="
}, },
"raw-body": { "raw-body": {
"version": "2.4.0", "version": "2.4.3",
"resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.4.0.tgz", "resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.4.3.tgz",
"integrity": "sha512-4Oz8DUIwdvoa5qMJelxipzi/iJIi40O5cGV1wNYp5hvZP8ZN0T+jiNkL0QepXs+EsQ9XJ8ipEDoiH70ySUJP3Q==", "integrity": "sha512-UlTNLIcu0uzb4D2f4WltY6cVjLi+/jEN4lgEUj3E04tpMDpUlkBo/eSn6zou9hum2VMNpCCUone0O0WeJim07g==",
"requires": { "requires": {
"bytes": "3.1.0", "bytes": "3.1.2",
"http-errors": "1.7.2", "http-errors": "1.8.1",
"iconv-lite": "0.4.24", "iconv-lite": "0.4.24",
"unpipe": "1.0.0" "unpipe": "1.0.0"
} }
}, },
"safe-buffer": { "safe-buffer": {
"version": "5.1.2", "version": "5.2.1",
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz", "resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz",
"integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==" "integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ=="
}, },
"safer-buffer": { "safer-buffer": {
"version": "2.1.2", "version": "2.1.2",
@@ -948,9 +965,9 @@
} }
}, },
"send": { "send": {
"version": "0.17.1", "version": "0.17.2",
"resolved": "https://registry.npmjs.org/send/-/send-0.17.1.tgz", "resolved": "https://registry.npmjs.org/send/-/send-0.17.2.tgz",
"integrity": "sha512-BsVKsiGcQMFwT8UxypobUKyv7irCNRHk1T0G680vk88yf6LBByGcZJOTJCrTP2xVN6yI+XjPJcNuE3V4fT9sAg==", "integrity": "sha512-UJYB6wFSJE3G00nEivR5rgWp8c2xXvJ3OPWPhmuteU0IKj8nKbG3DrjiOmLwpnHGYWAVwA69zmTm++YG0Hmwww==",
"requires": { "requires": {
"debug": "2.6.9", "debug": "2.6.9",
"depd": "~1.1.2", "depd": "~1.1.2",
@@ -959,36 +976,36 @@
"escape-html": "~1.0.3", "escape-html": "~1.0.3",
"etag": "~1.8.1", "etag": "~1.8.1",
"fresh": "0.5.2", "fresh": "0.5.2",
"http-errors": "~1.7.2", "http-errors": "1.8.1",
"mime": "1.6.0", "mime": "1.6.0",
"ms": "2.1.1", "ms": "2.1.3",
"on-finished": "~2.3.0", "on-finished": "~2.3.0",
"range-parser": "~1.2.1", "range-parser": "~1.2.1",
"statuses": "~1.5.0" "statuses": "~1.5.0"
}, },
"dependencies": { "dependencies": {
"ms": { "ms": {
"version": "2.1.1", "version": "2.1.3",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.1.tgz", "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
"integrity": "sha512-tgp+dl5cGk28utYktBsrFqA7HKgrhgPsg6Z/EfhWI4gl1Hwq8B/GmY/0oXZ6nF8hDVesS/FpnYaD/kOWhYQvyg==" "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="
} }
} }
}, },
"serve-static": { "serve-static": {
"version": "1.14.1", "version": "1.14.2",
"resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.14.1.tgz", "resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.14.2.tgz",
"integrity": "sha512-JMrvUwE54emCYWlTI+hGrGv5I8dEwmco/00EvkzIIsR7MqrHonbD9pO2MOfFnpFntl7ecpZs+3mW+XbQZu9QCg==", "integrity": "sha512-+TMNA9AFxUEGuC0z2mevogSnn9MXKb4fa7ngeRMJaaGv8vTwnIEkKi+QGvPt33HSnf8pRS+WGM0EbMtCJLKMBQ==",
"requires": { "requires": {
"encodeurl": "~1.0.2", "encodeurl": "~1.0.2",
"escape-html": "~1.0.3", "escape-html": "~1.0.3",
"parseurl": "~1.3.3", "parseurl": "~1.3.3",
"send": "0.17.1" "send": "0.17.2"
} }
}, },
"setprototypeof": { "setprototypeof": {
"version": "1.1.1", "version": "1.2.0",
"resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.1.1.tgz", "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.2.0.tgz",
"integrity": "sha512-JvdAWfbXeIGaZ9cILp38HntZSFSo3mWg6xGcJJsd+d4aRMOqauag1C63dJfDw7OaMYwEbHMOxEZ1lqVRYP2OAw==" "integrity": "sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw=="
}, },
"statuses": { "statuses": {
"version": "1.5.0", "version": "1.5.0",
@@ -996,9 +1013,9 @@
"integrity": "sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow=" "integrity": "sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow="
}, },
"toidentifier": { "toidentifier": {
"version": "1.0.0", "version": "1.0.1",
"resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.0.tgz", "resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.1.tgz",
"integrity": "sha512-yaOH/Pk/VEhBWWTlhI+qXxDFXlejDGcQipMlyxda9nthulaxLZUNcUqFxokp0vcYnvteJln5FNQDRrxj3YcbVw==" "integrity": "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA=="
}, },
"type-is": { "type-is": {
"version": "1.6.18", "version": "1.6.18",


@@ -1,12 +1,12 @@
{ {
"name": "piston-api", "name": "piston-api",
"version": "3.1.0", "version": "3.1.1",
"description": "API for piston - a high performance code execution engine", "description": "API for piston - a high performance code execution engine",
"main": "src/index.js", "main": "src/index.js",
"dependencies": { "dependencies": {
"body-parser": "^1.19.0", "body-parser": "^1.19.0",
"chownr": "^2.0.0", "chownr": "^2.0.0",
"express": "^4.17.1", "express": "^4.17.3",
"express-ws": "^5.0.2", "express-ws": "^5.0.2",
"is-docker": "^2.1.1", "is-docker": "^2.1.1",
"logplease": "^1.2.15", "logplease": "^1.2.15",


@@ -6,50 +6,9 @@ const events = require('events');
const runtime = require('../runtime'); const runtime = require('../runtime');
const { Job } = require('../job'); const { Job } = require('../job');
const package = require('../package'); const package = require('../package');
const globals = require('../globals');
const logger = require('logplease').create('api/v2'); const logger = require('logplease').create('api/v2');
const SIGNALS = [
'SIGABRT',
'SIGALRM',
'SIGBUS',
'SIGCHLD',
'SIGCLD',
'SIGCONT',
'SIGEMT',
'SIGFPE',
'SIGHUP',
'SIGILL',
'SIGINFO',
'SIGINT',
'SIGIO',
'SIGIOT',
'SIGKILL',
'SIGLOST',
'SIGPIPE',
'SIGPOLL',
'SIGPROF',
'SIGPWR',
'SIGQUIT',
'SIGSEGV',
'SIGSTKFLT',
'SIGSTOP',
'SIGTSTP',
'SIGSYS',
'SIGTERM',
'SIGTRAP',
'SIGTTIN',
'SIGTTOU',
'SIGUNUSED',
'SIGURG',
'SIGUSR1',
'SIGUSR2',
'SIGVTALRM',
'SIGXCPU',
'SIGXFSZ',
'SIGWINCH',
];
// ref: https://man7.org/linux/man-pages/man7/signal.7.html
function get_job(body) { function get_job(body) {
let { let {
language, language,
@@ -61,6 +20,8 @@ function get_job(body) {
run_memory_limit, run_memory_limit,
run_timeout, run_timeout,
compile_timeout, compile_timeout,
run_cpu_time,
compile_cpu_time,
} = body; } = body;
return new Promise((resolve, reject) => { return new Promise((resolve, reject) => {
@@ -106,7 +67,7 @@
}); });
} }
for (const constraint of ['memory_limit', 'timeout']) { for (const constraint of ['memory_limit', 'timeout', 'cpu_time']) {
for (const type of ['compile', 'run']) { for (const type of ['compile', 'run']) {
const constraint_name = `${type}_${constraint}`; const constraint_name = `${type}_${constraint}`;
const constraint_value = body[constraint_name]; const constraint_value = body[constraint_name];
@@ -135,23 +96,23 @@
} }
} }
compile_timeout = compile_timeout || rt.timeouts.compile;
run_timeout = run_timeout || rt.timeouts.run;
compile_memory_limit = compile_memory_limit || rt.memory_limits.compile;
run_memory_limit = run_memory_limit || rt.memory_limits.run;
resolve( resolve(
new Job({ new Job({
runtime: rt, runtime: rt,
args: args || [], args: args ?? [],
stdin: stdin || '', stdin: stdin ?? '',
files, files,
timeouts: { timeouts: {
run: run_timeout, run: run_timeout ?? rt.timeouts.run,
compile: compile_timeout, compile: compile_timeout ?? rt.timeouts.compile,
},
cpu_times: {
run: run_cpu_time ?? rt.cpu_times.run,
compile: compile_cpu_time ?? rt.cpu_times.compile,
}, },
memory_limits: { memory_limits: {
run: run_memory_limit, run: run_memory_limit ?? rt.memory_limits.run,
compile: compile_memory_limit, compile: compile_memory_limit ?? rt.memory_limits.compile,
}, },
}) })
); );
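The switch from `||` to `??` in the `Job` construction above matters for numeric limits: a caller-supplied `0` is falsy, so `||` would silently discard it in favor of the runtime default, while `??` only falls back when the value is `null` or `undefined`. A minimal sketch of the difference (the `3000` default is illustrative, not a real Piston value):

```javascript
// Defaulting with || discards legitimate falsy inputs such as 0;
// ?? falls back only when the value is null or undefined.
const rt_default = 3000; // hypothetical runtime default, in ms

const with_or = 0 || rt_default;      // 0 is falsy, so the default wins
const with_nullish = 0 ?? rt_default; // 0 is kept

console.log(with_or, with_nullish); // 3000 0
```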
@@ -163,7 +124,7 @@ router.use((req, res, next) => {
return next(); return next();
} }
if (!req.headers['content-type'].startsWith('application/json')) { if (!req.headers['content-type']?.startsWith('application/json')) {
return res.status(415).send({ return res.status(415).send({
message: 'requests must be of type application/json', message: 'requests must be of type application/json',
}); });
@@ -174,9 +135,9 @@ router.use((req, res, next) => {
router.ws('/connect', async (ws, req) => { router.ws('/connect', async (ws, req) => {
let job = null; let job = null;
let eventBus = new events.EventEmitter(); let event_bus = new events.EventEmitter();
eventBus.on('stdout', data => event_bus.on('stdout', data =>
ws.send( ws.send(
JSON.stringify({ JSON.stringify({
type: 'data', type: 'data',
@@ -185,7 +146,7 @@ router.ws('/connect', async (ws, req) => {
}) })
) )
); );
eventBus.on('stderr', data => event_bus.on('stderr', data =>
ws.send( ws.send(
JSON.stringify({ JSON.stringify({
type: 'data', type: 'data',
@@ -194,10 +155,10 @@ router.ws('/connect', async (ws, req) => {
}) })
) )
); );
eventBus.on('stage', stage => event_bus.on('stage', stage =>
ws.send(JSON.stringify({ type: 'stage', stage })) ws.send(JSON.stringify({ type: 'stage', stage }))
); );
eventBus.on('exit', (stage, status) => event_bus.on('exit', (stage, status) =>
ws.send(JSON.stringify({ type: 'exit', stage, ...status })) ws.send(JSON.stringify({ type: 'exit', stage, ...status }))
); );
@@ -210,19 +171,27 @@ router.ws('/connect', async (ws, req) => {
if (job === null) { if (job === null) {
job = await get_job(msg); job = await get_job(msg);
await job.prime(); try {
const box = await job.prime();
ws.send( ws.send(
JSON.stringify({ JSON.stringify({
type: 'runtime', type: 'runtime',
language: job.runtime.language, language: job.runtime.language,
version: job.runtime.version.raw, version: job.runtime.version.raw,
}) })
); );
await job.execute_interactive(eventBus); await job.execute(box, event_bus);
} catch (error) {
ws.close(4999, 'Job Completed'); logger.error(
`Error cleaning up job: ${job.uuid}:\n${error}`
);
throw error;
} finally {
await job.cleanup();
}
ws.close(4999, 'Job Completed'); // Will not execute if an error is thrown above
} else { } else {
ws.close(4000, 'Already Initialized'); ws.close(4000, 'Already Initialized');
} }
@@ -230,7 +199,7 @@ router.ws('/connect', async (ws, req) => {
case 'data': case 'data':
if (job !== null) { if (job !== null) {
if (msg.stream === 'stdin') { if (msg.stream === 'stdin') {
eventBus.emit('stdin', msg.data); event_bus.emit('stdin', msg.data);
} else { } else {
ws.close(4004, 'Can only write to stdin'); ws.close(4004, 'Can only write to stdin');
} }
@@ -240,8 +209,10 @@ router.ws('/connect', async (ws, req) => {
break; break;
case 'signal': case 'signal':
if (job !== null) { if (job !== null) {
if (SIGNALS.includes(msg.signal)) { if (
eventBus.emit('signal', msg.signal); Object.values(globals.SIGNALS).includes(msg.signal)
) {
event_bus.emit('signal', msg.signal);
} else { } else {
ws.close(4005, 'Invalid signal'); ws.close(4005, 'Invalid signal');
} }
@@ -257,12 +228,6 @@ router.ws('/connect', async (ws, req) => {
} }
}); });
ws.on('close', async () => {
if (job !== null) {
await job.cleanup();
}
});
setTimeout(() => { setTimeout(() => {
//Terminate the socket after 1 second, if not initialized. //Terminate the socket after 1 second, if not initialized.
if (job === null) ws.close(4001, 'Initialization Timeout'); if (job === null) ws.close(4001, 'Initialization Timeout');
@@ -270,18 +235,32 @@ router.ws('/connect', async (ws, req) => {
}); });
router.post('/execute', async (req, res) => { router.post('/execute', async (req, res) => {
let job;
try { try {
const job = await get_job(req.body); job = await get_job(req.body);
} catch (error) {
return res.status(400).json(error);
}
try {
const box = await job.prime();
await job.prime(); let result = await job.execute(box);
// Backward compatibility when the run stage is not started
const result = await job.execute(); if (result.run === undefined) {
result.run = result.compile;
await job.cleanup(); }
return res.status(200).send(result); return res.status(200).send(result);
} catch (error) { } catch (error) {
return res.status(400).json(error); logger.error(`Error executing job: ${job.uuid}:\n${error}`);
return res.status(500).send();
} finally {
try {
await job.cleanup(); // This gets executed before the returns in try/catch
} catch (error) {
logger.error(`Error cleaning up job: ${job.uuid}:\n${error}`);
return res.status(500).send(); // On error, this replaces the return in the outer try-catch
}
} }
}); });
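The rewritten `/execute` handler above leans on two JavaScript guarantees: `finally` runs before the `return` in `try`/`catch` actually takes effect, and a `return` inside `finally` replaces the pending one. A standalone sketch of that control flow (status codes are illustrative):

```javascript
// A return inside finally overrides whatever try/catch was about to return,
// which is how a failed cleanup can turn a 200 response into a 500.
function respond(cleanup_fails) {
    try {
        return 200; // normal path
    } finally {
        // cleanup happens here, before the 200 is actually returned
        if (cleanup_fails) {
            return 500; // replaces the 200 from the try block
        }
    }
}

console.log(respond(false)); // 200
console.log(respond(true));  // 500
```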


@@ -90,6 +90,18 @@ const options = {
parser: parse_int, parser: parse_int,
validators: [(x, raw) => !is_nan(x) || `${raw} is not a number`], validators: [(x, raw) => !is_nan(x) || `${raw} is not a number`],
}, },
compile_cpu_time: {
desc: 'Max CPU time allowed for compile stage in milliseconds',
default: 10000, // 10 seconds
parser: parse_int,
validators: [(x, raw) => !is_nan(x) || `${raw} is not a number`],
},
run_cpu_time: {
desc: 'Max CPU time allowed for run stage in milliseconds',
default: 3000, // 3 seconds
parser: parse_int,
validators: [(x, raw) => !is_nan(x) || `${raw} is not a number`],
},
compile_memory_limit: { compile_memory_limit: {
desc: 'Max memory usage for compile stage in bytes (set to -1 for no limit)', desc: 'Max memory usage for compile stage in bytes (set to -1 for no limit)',
default: -1, // no limit default: -1, // no limit
@@ -117,7 +129,7 @@
limit_overrides: { limit_overrides: {
desc: 'Per-language exceptions in JSON format for each of:\ desc: 'Per-language exceptions in JSON format for each of:\
max_process_count, max_open_files, max_file_size, compile_memory_limit,\ max_process_count, max_open_files, max_file_size, compile_memory_limit,\
run_memory_limit, compile_timeout, run_timeout, output_max_size', run_memory_limit, compile_timeout, run_timeout, compile_cpu_time, run_cpu_time, output_max_size',
default: {}, default: {},
parser: parse_overrides, parser: parse_overrides,
validators: [ validators: [
@@ -165,6 +177,8 @@ function parse_overrides(overrides_string) {
'run_memory_limit', 'run_memory_limit',
'compile_timeout', 'compile_timeout',
'run_timeout', 'run_timeout',
'compile_cpu_time',
'run_cpu_time',
'output_max_size', 'output_max_size',
].includes(key) ].includes(key)
) { ) {
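The `compile_cpu_time` and `run_cpu_time` keys added above slot into the same three-level precedence as the other limits: a per-request value wins, then a per-language entry in `limit_overrides`, then the global default. A hedged sketch of that resolution order (`resolve_limit` is an illustrative helper, not Piston's actual code):

```javascript
// Resolve an effective limit: request value first, then a per-language
// override, then the global default. ?? keeps explicit values like 0 or -1.
function resolve_limit(request_value, overrides, key, global_default) {
    return request_value ?? overrides[key] ?? global_default;
}

const overrides = { run_cpu_time: 10000 }; // e.g. a slow interpreted language

resolve_limit(undefined, overrides, 'run_cpu_time', 3000); // -> 10000
resolve_limit(500, overrides, 'run_cpu_time', 3000);       // -> 500
resolve_limit(undefined, {}, 'run_cpu_time', 3000);        // -> 3000
```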

api/src/docker-entrypoint.sh Executable file

@@ -0,0 +1,29 @@
#!/bin/bash
CGROUP_FS="/sys/fs/cgroup"
if [ ! -e "$CGROUP_FS" ]; then
echo "Cannot find $CGROUP_FS. Please make sure your system is using cgroup v2"
exit 1
fi
if [ -e "$CGROUP_FS/unified" ]; then
echo "Combined cgroup v1+v2 mode is not supported. Please make sure your system is using pure cgroup v2"
exit 1
fi
if [ ! -e "$CGROUP_FS/cgroup.subtree_control" ]; then
echo "Cgroup v2 not found. Please make sure cgroup v2 is enabled on your system"
exit 1
fi
cd /sys/fs/cgroup && \
mkdir isolate/ && \
echo 1 > isolate/cgroup.procs && \
echo '+cpuset +cpu +io +memory +pids' > cgroup.subtree_control && \
cd isolate && \
mkdir init && \
echo 1 > init/cgroup.procs && \
echo '+cpuset +memory' > cgroup.subtree_control && \
echo "Initialized cgroup" && \
chown -R piston:piston /piston && \
exec su -- piston -c 'ulimit -n 65536 && node /piston_api/src'
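The entrypoint's three guard conditions can be exercised against a scratch directory, which is handy for checking the detection logic without touching the real `/sys/fs/cgroup` (the `check_cgroup_v2` helper below is illustrative and not part of the repository):

```shell
#!/bin/bash
# Reproduce the entrypoint's cgroup v2 detection against an arbitrary path.
check_cgroup_v2() {
    local fs="$1"
    [ -e "$fs" ] || { echo "missing"; return 1; }
    [ -e "$fs/unified" ] && { echo "hybrid"; return 1; }
    [ -e "$fs/cgroup.subtree_control" ] || { echo "v1-only"; return 1; }
    echo "ok"
}

# Simulate a pure cgroup v2 mount in a temporary directory
fake="$(mktemp -d)"
touch "$fake/cgroup.subtree_control"
check_cgroup_v2 "$fake"   # prints: ok
```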


@@ -7,14 +7,78 @@ const platform = `${is_docker() ? 'docker' : 'baremetal'}-${fs
.split('\n') .split('\n')
.find(x => x.startsWith('ID')) .find(x => x.startsWith('ID'))
.replace('ID=', '')}`; .replace('ID=', '')}`;
const SIGNALS = {
1: 'SIGHUP',
2: 'SIGINT',
3: 'SIGQUIT',
4: 'SIGILL',
5: 'SIGTRAP',
6: 'SIGABRT',
7: 'SIGBUS',
8: 'SIGFPE',
9: 'SIGKILL',
10: 'SIGUSR1',
11: 'SIGSEGV',
12: 'SIGUSR2',
13: 'SIGPIPE',
14: 'SIGALRM',
15: 'SIGTERM',
16: 'SIGSTKFLT',
17: 'SIGCHLD',
18: 'SIGCONT',
19: 'SIGSTOP',
20: 'SIGTSTP',
21: 'SIGTTIN',
22: 'SIGTTOU',
23: 'SIGURG',
24: 'SIGXCPU',
25: 'SIGXFSZ',
26: 'SIGVTALRM',
27: 'SIGPROF',
28: 'SIGWINCH',
29: 'SIGIO',
30: 'SIGPWR',
31: 'SIGSYS',
34: 'SIGRTMIN',
35: 'SIGRTMIN+1',
36: 'SIGRTMIN+2',
37: 'SIGRTMIN+3',
38: 'SIGRTMIN+4',
39: 'SIGRTMIN+5',
40: 'SIGRTMIN+6',
41: 'SIGRTMIN+7',
42: 'SIGRTMIN+8',
43: 'SIGRTMIN+9',
44: 'SIGRTMIN+10',
45: 'SIGRTMIN+11',
46: 'SIGRTMIN+12',
47: 'SIGRTMIN+13',
48: 'SIGRTMIN+14',
49: 'SIGRTMIN+15',
50: 'SIGRTMAX-14',
51: 'SIGRTMAX-13',
52: 'SIGRTMAX-12',
53: 'SIGRTMAX-11',
54: 'SIGRTMAX-10',
55: 'SIGRTMAX-9',
56: 'SIGRTMAX-8',
57: 'SIGRTMAX-7',
58: 'SIGRTMAX-6',
59: 'SIGRTMAX-5',
60: 'SIGRTMAX-4',
61: 'SIGRTMAX-3',
62: 'SIGRTMAX-2',
63: 'SIGRTMAX-1',
64: 'SIGRTMAX',
};
module.exports = {
data_directories: {
packages: 'packages',
jobs: 'jobs',
},
version: require('../package.json').version,
platform,
pkg_installed_file: '.ppman-installed', //Used as indication for if a package was installed
clean_directories: ['/dev/shm', '/run/lock', '/tmp', '/var/tmp'],
SIGNALS,
};

@@ -35,7 +35,6 @@ expressWs(app);
}
}
});
fss.chmodSync(path.join(config.data_directory, globals.data_directories.jobs), 0o711)
logger.info('Loading packages');
const pkgdir = path.join(
@@ -79,6 +78,12 @@ expressWs(app);
const api_v2 = require('./api/v2');
app.use('/api/v2', api_v2);
const { version } = require('../package.json');
app.get('/', (req, res, next) => {
return res.status(200).send({ message: `Piston v${version}` });
});
app.use((req, res, next) => {
return res.status(404).send({ message: 'Not Found' });
});
@@ -86,7 +91,12 @@ expressWs(app);
logger.debug('Calling app.listen');
const [address, port] = config.bind_address.split(':');
app.listen(port, address, () => {
const server = app.listen(port, address, () => {
logger.info('API server started on', config.bind_address);
});
process.on('SIGTERM', () => {
server.close();
process.exit(0)
});
})();

@@ -1,13 +1,10 @@
const logplease = require('logplease');
const logger = logplease.create('job');
const { v4: uuidv4 } = require('uuid');
const cp = require('child_process');
const path = require('path');
const config = require('./config');
const globals = require('./globals');
const fs = require('fs/promises');
const fss = require('fs');
const globals = require('./globals');
const wait_pid = require('waitpid');
const job_states = {
READY: Symbol('Ready to be primed'),
@@ -15,21 +12,26 @@ const job_states = {
EXECUTED: Symbol('Executed and ready for cleanup'),
};
let uid = 0;
let gid = 0;
const MAX_BOX_ID = 999;
const ISOLATE_PATH = '/usr/local/bin/isolate';
let box_id = 0;
let remaining_job_spaces = config.max_concurrent_jobs;
let jobQueue = [];
let job_queue = [];
setInterval(() => {
const get_next_box_id = () => ++box_id % MAX_BOX_ID;
// Every 10ms try resolve a new job, if there is an available slot
if (jobQueue.length > 0 && remaining_job_spaces > 0) {
jobQueue.shift()();
}
}, 10);
class Job {
constructor({ runtime, files, args, stdin, timeouts, memory_limits }) {
#dirty_boxes;
constructor({
runtime,
files,
args,
stdin,
timeouts,
cpu_times,
memory_limits,
}) {
this.uuid = uuidv4();
this.logger = logplease.create(`job/${this.uuid}`);
@@ -45,187 +47,282 @@ class Job {
this.args = args;
this.stdin = stdin;
// Add a trailing newline if it doesn't exist
if (this.stdin.slice(-1) !== '\n') {
this.stdin += '\n';
}
this.timeouts = timeouts;
this.cpu_times = cpu_times;
this.memory_limits = memory_limits;
this.uid = config.runner_uid_min + uid;
this.gid = config.runner_gid_min + gid;
uid++;
gid++;
uid %= config.runner_uid_max - config.runner_uid_min + 1;
gid %= config.runner_gid_max - config.runner_gid_min + 1;
this.logger.debug(`Assigned uid=${this.uid} gid=${this.gid}`);
this.state = job_states.READY;
this.dir = path.join(
config.data_directory,
globals.data_directories.jobs,
this.uuid
);
this.#dirty_boxes = [];
}
async #create_isolate_box() {
const box_id = get_next_box_id();
const metadata_file_path = `/tmp/${box_id}-metadata.txt`;
return new Promise((res, rej) => {
cp.exec(
`isolate --init --cg -b${box_id}`,
(error, stdout, stderr) => {
if (error) {
rej(
`Failed to run isolate --init: ${error.message}\nstdout: ${stdout}\nstderr: ${stderr}`
);
}
if (stdout === '') {
rej('Received empty stdout from isolate --init');
}
const box = {
id: box_id,
metadata_file_path,
dir: `${stdout.trim()}/box`,
};
this.#dirty_boxes.push(box);
res(box);
}
);
});
}
async prime() {
if (remaining_job_spaces < 1) {
this.logger.info(`Awaiting job slot`);
await new Promise(resolve => {
jobQueue.push(resolve);
job_queue.push(resolve);
});
}
this.logger.info(`Priming job`);
remaining_job_spaces--;
this.logger.debug('Writing files to job cache');
this.logger.debug('Running isolate --init');
const box = await this.#create_isolate_box();
this.logger.debug(`Transfering ownership`);
await fs.mkdir(this.dir, { mode: 0o700 });
await fs.chown(this.dir, this.uid, this.gid);
this.logger.debug(`Creating submission files in Isolate box`);
const submission_dir = path.join(box.dir, 'submission');
await fs.mkdir(submission_dir);
for (const file of this.files) {
const file_path = path.join(this.dir, file.name);
const file_path = path.join(submission_dir, file.name);
const rel = path.relative(this.dir, file_path);
const rel = path.relative(submission_dir, file_path);
const file_content = Buffer.from(file.content, file.encoding);
if (rel.startsWith('..')) if (rel.startsWith('..'))
throw Error(
`File path "${file.name}" tries to escape parent directory: ${rel}`
);
const file_content = Buffer.from(file.content, file.encoding);
await fs.mkdir(path.dirname(file_path), {
recursive: true,
mode: 0o700,
});
await fs.chown(path.dirname(file_path), this.uid, this.gid);
await fs.write_file(file_path, file_content);
await fs.chown(file_path, this.uid, this.gid);
}
this.state = job_states.PRIMED;
this.logger.debug('Primed job');
return box;
}
async safe_call(file, args, timeout, memory_limit, eventBus = null) {
return new Promise((resolve, reject) => {
const nonetwork = config.disable_networking ? ['nosocket'] : [];
async safe_call(
box,
file,
args,
timeout,
cpu_time,
memory_limit,
event_bus = null
) {
let stdout = '';
let stderr = '';
let output = '';
let memory = null;
let code = null;
let signal = null;
let message = null;
let status = null;
let cpu_time_stat = null;
let wall_time_stat = null;
const prlimit = [
'prlimit',
'--nproc=' + this.runtime.max_process_count,
'--nofile=' + this.runtime.max_open_files,
'--fsize=' + this.runtime.max_file_size,
];
const timeout_call = [
'timeout', '-s', '9', Math.ceil(timeout / 1000),
];
if (memory_limit >= 0) {
prlimit.push('--as=' + memory_limit);
}
const proc_call = [
'nice',
...timeout_call,
...prlimit,
...nonetwork,
'bash',
file,
const proc = cp.spawn(
ISOLATE_PATH,
[
'--run',
`-b${box.id}`,
`--meta=${box.metadata_file_path}`,
'--cg',
'-s',
'-c',
'/box/submission',
'-E',
'HOME=/tmp',
...this.runtime.env_vars.flat_map(v => ['-E', v]),
'-E',
`PISTON_LANGUAGE=${this.runtime.language}`,
`--dir=${this.runtime.pkgdir}`,
`--dir=/etc:noexec`,
`--processes=${this.runtime.max_process_count}`,
`--open-files=${this.runtime.max_open_files}`,
`--fsize=${Math.floor(this.runtime.max_file_size / 1000)}`,
`--wall-time=${timeout / 1000}`,
`--time=${cpu_time / 1000}`,
`--extra-time=0`,
...(memory_limit >= 0
? [`--cg-mem=${Math.floor(memory_limit / 1000)}`]
: []),
...(config.disable_networking ? [] : ['--share-net']),
'--',
'/bin/bash',
path.join(this.runtime.pkgdir, file),
...args,
];
],
{
var stdout = '';
var stderr = '';
var output = '';
const proc = cp.spawn(proc_call[0], proc_call.splice(1), {
env: {
...this.runtime.env_vars,
PISTON_LANGUAGE: this.runtime.language,
},
stdio: 'pipe',
cwd: this.dir,
uid: this.uid,
gid: this.gid,
detached: true, //give this process its own process group
});
if (eventBus === null) {
proc.stdin.write(this.stdin);
proc.stdin.end();
proc.stdin.destroy();
} else {
eventBus.on('stdin', data => {
proc.stdin.write(data);
});
eventBus.on('kill', signal => {
proc.kill(signal);
});
}
const kill_timeout =
(timeout >= 0 &&
set_timeout(async _ => {
this.logger.info(`Timeout exceeded timeout=${timeout}`);
process.kill(proc.pid, 'SIGKILL');
}, timeout)) ||
null;
);
if (event_bus === null) {
proc.stdin.write(this.stdin);
proc.stdin.end();
proc.stdin.destroy();
} else {
event_bus.on('stdin', data => {
proc.stdin.write(data);
proc.stderr.on('data', async data => {
if (eventBus !== null) {
eventBus.emit('stderr', data);
} else if (stderr.length > this.runtime.output_max_size) {
this.logger.info(`stderr length exceeded`);
process.kill(proc.pid, 'SIGKILL');
} else {
stderr += data;
output += data;
}
});
proc.stdout.on('data', async data => {
if (eventBus !== null) {
event_bus.on('kill', signal => {
proc.kill(signal);
eventBus.emit('stdout', data);
} else if (stdout.length > this.runtime.output_max_size) {
this.logger.info(`stdout length exceeded`);
process.kill(proc.pid, 'SIGKILL');
} else {
stdout += data;
output += data;
}
});
}
const exit_cleanup = () => {
clear_timeout(kill_timeout);
proc.stderr.on('data', async data => {
if (event_bus !== null) {
event_bus.emit('stderr', data);
} else if (
stderr.length + data.length >
this.runtime.output_max_size
) {
message = 'stderr length exceeded';
status = 'EL';
this.logger.info(message);
try {
process.kill(proc.pid, 'SIGABRT');
} catch (e) {
// Could already be dead and just needs to be waited on
this.logger.debug(
`Got error while SIGABRTing process ${proc}:`,
e
);
}
} else {
stderr += data;
output += data;
}
});
proc.stderr.destroy();
proc.stdout.destroy();
proc.stdout.on('data', async data => {
if (event_bus !== null) {
event_bus.emit('stdout', data);
} else if (
stdout.length + data.length >
this.runtime.output_max_size
) {
message = 'stdout length exceeded';
status = 'OL';
this.logger.info(message);
try {
process.kill(proc.pid, 'SIGABRT');
} catch (e) {
// Could already be dead and just needs to be waited on
this.logger.debug(
`Got error while SIGABRTing process ${proc}:`,
e
);
}
} else {
stdout += data;
output += data;
}
});
this.cleanup_processes();
this.logger.debug(`Finished exit cleanup`);
};
const data = await new Promise((res, rej) => {
proc.on('exit', (_, signal) => {
res({
signal,
});
proc.on('exit', (code, signal) => {
exit_cleanup();
resolve({ stdout, stderr, code, signal, output });
});
proc.on('error', err => {
exit_cleanup();
rej({
error: err,
reject({ error: err, stdout, stderr, output });
});
});
});
try {
const metadata_str = (
await fs.read_file(box.metadata_file_path)
).toString();
const metadata_lines = metadata_str.split('\n');
for (const line of metadata_lines) {
if (!line) continue;
const [key, value] = line.split(':');
if (key === undefined || value === undefined) {
throw new Error(
`Failed to parse metadata file, received: ${line}`
);
}
switch (key) {
case 'cg-mem':
memory = parse_int(value) * 1000;
break;
case 'exitcode':
code = parse_int(value);
break;
case 'exitsig':
signal = globals.SIGNALS[parse_int(value)] ?? null;
break;
case 'message':
message = message || value;
break;
case 'status':
status = status || value;
break;
case 'time':
cpu_time_stat = parse_float(value) * 1000;
break;
case 'time-wall':
wall_time_stat = parse_float(value) * 1000;
break;
default:
break;
}
}
} catch (e) {
throw new Error(
`Error reading metadata file: ${box.metadata_file_path}\nError: ${e.message}\nIsolate run stdout: ${stdout}\nIsolate run stderr: ${stderr}`
);
}
return {
...data,
stdout,
stderr,
code,
signal: ['TO', 'OL', 'EL'].includes(status) ? 'SIGKILL' : signal,
output,
memory,
message,
status,
cpu_time: cpu_time_stat,
wall_time: wall_time_stat,
};
}
async execute() {
async execute(box, event_bus = null) {
if (this.state !== job_states.PRIMED) {
throw new Error(
'Job must be in primed state, current state: ' +
@@ -242,24 +339,66 @@ class Job {
this.logger.debug('Compiling');
let compile;
let compile_errored = false;
const { emit_event_bus_result, emit_event_bus_stage } =
event_bus === null
? {
emit_event_bus_result: () => {},
emit_event_bus_stage: () => {},
}
: {
emit_event_bus_result: (stage, result) => {
const { error, code, signal } = result;
event_bus.emit('exit', stage, {
error,
code,
signal,
});
},
emit_event_bus_stage: stage => {
event_bus.emit('stage', stage);
},
};
if (this.runtime.compiled) {
this.logger.debug('Compiling');
emit_event_bus_stage('compile');
compile = await this.safe_call(
path.join(this.runtime.pkgdir, 'compile'),
box,
'compile',
code_files.map(x => x.name),
this.timeouts.compile,
this.memory_limits.compile
this.cpu_times.compile,
this.memory_limits.compile,
event_bus
);
emit_event_bus_result('compile', compile);
compile_errored = compile.code !== 0;
if (!compile_errored) {
const old_box_dir = box.dir;
box = await this.#create_isolate_box();
await fs.rename(
path.join(old_box_dir, 'submission'),
path.join(box.dir, 'submission')
);
}
}
this.logger.debug('Running');
const run = await this.safe_call(
path.join(this.runtime.pkgdir, 'run'),
[code_files[0].name, ...this.args],
this.timeouts.run,
this.memory_limits.run
);
let run;
if (!compile_errored) {
this.logger.debug('Running');
emit_event_bus_stage('run');
run = await this.safe_call(
box,
'run',
[code_files[0].name, ...this.args],
this.timeouts.run,
this.cpu_times.run,
this.memory_limits.run,
event_bus
);
emit_event_bus_result('run', run);
}
this.state = job_states.EXECUTED;
@@ -271,179 +410,34 @@ class Job {
};
}
async execute_interactive(eventBus) {
if (this.state !== job_states.PRIMED) {
throw new Error(
'Job must be in primed state, current state: ' +
this.state.toString()
);
}
this.logger.info(
`Interactively executing job runtime=${this.runtime.toString()}`
);
const code_files =
(this.runtime.language === 'file' && this.files) ||
this.files.filter(file => file.encoding == 'utf8');
if (this.runtime.compiled) {
eventBus.emit('stage', 'compile');
const { error, code, signal } = await this.safe_call(
path.join(this.runtime.pkgdir, 'compile'),
code_files.map(x => x.name),
this.timeouts.compile,
this.memory_limits.compile,
eventBus
);
eventBus.emit('exit', 'compile', { error, code, signal });
}
this.logger.debug('Running');
eventBus.emit('stage', 'run');
const { error, code, signal } = await this.safe_call(
path.join(this.runtime.pkgdir, 'run'),
[code_files[0].name, ...this.args],
this.timeouts.run,
this.memory_limits.run,
eventBus
);
eventBus.emit('exit', 'run', { error, code, signal });
this.state = job_states.EXECUTED;
}
cleanup_processes(dont_wait = []) {
let processes = [1];
const to_wait = [];
this.logger.debug(`Cleaning up processes`);
while (processes.length > 0) {
processes = [];
const proc_ids = fss.readdir_sync('/proc');
processes = proc_ids.map(proc_id => {
if (isNaN(proc_id)) return -1;
try {
const proc_status = fss.read_file_sync(
path.join('/proc', proc_id, 'status')
);
const proc_lines = proc_status.to_string().split('\n');
const state_line = proc_lines.find(line =>
line.starts_with('State:')
);
const uid_line = proc_lines.find(line =>
line.starts_with('Uid:')
);
const [_, ruid, euid, suid, fuid] = uid_line.split(/\s+/);
const [_1, state, user_friendly] = state_line.split(/\s+/);
const proc_id_int = parse_int(proc_id);
// Skip over any processes that aren't ours.
if(ruid != this.uid && euid != this.uid) return -1;
if (state == 'Z'){
// Zombie process, just needs to be waited, regardless of the user id
if(!to_wait.includes(proc_id_int))
to_wait.push(proc_id_int);
return -1;
}
// We should kill in all other state (Sleep, Stopped & Running)
return proc_id_int;
} catch {
return -1;
}
return -1;
});
processes = processes.filter(p => p > 0);
if (processes.length > 0)
this.logger.debug(`Got processes to kill: ${processes}`);
for (const proc of processes) {
// First stop the processes, but keep their resources allocated so they cant re-fork
try {
process.kill(proc, 'SIGSTOP');
} catch (e) {
// Could already be dead
this.logger.debug(
`Got error while SIGSTOPping process ${proc}:`,
e
);
}
}
for (const proc of processes) {
// Then clear them out of the process tree
try {
process.kill(proc, 'SIGKILL');
} catch(e) {
// Could already be dead and just needs to be waited on
this.logger.debug(
`Got error while SIGKILLing process ${proc}:`,
e
);
}
to_wait.push(proc);
}
}
this.logger.debug(
`Finished kill-loop, calling wait_pid to end any zombie processes`
);
for (const proc of to_wait) {
if (dont_wait.includes(proc)) continue;
wait_pid(proc);
}
this.logger.debug(`Cleaned up processes`);
}
async cleanup_filesystem() {
for (const clean_path of globals.clean_directories) {
const contents = await fs.readdir(clean_path);
for (const file of contents) {
const file_path = path.join(clean_path, file);
try {
const stat = await fs.stat(file_path);
if (stat.uid === this.uid) {
await fs.rm(file_path, {
recursive: true,
force: true,
});
}
} catch (e) {
// File was somehow deleted in the time that we read the dir to when we checked the file
this.logger.warn(`Error removing file ${file_path}: ${e}`);
}
}
}
await fs.rm(this.dir, { recursive: true, force: true });
}
async cleanup() {
this.logger.info(`Cleaning up job`);
this.cleanup_processes(); // Run process janitor, just incase there are any residual processes somehow
await this.cleanup_filesystem();
remaining_job_spaces++;
if (job_queue.length > 0) {
job_queue.shift()();
}
await Promise.all(
this.#dirty_boxes.map(async box => {
cp.exec(
`isolate --cleanup --cg -b${box.id}`,
(error, stdout, stderr) => {
if (error) {
this.logger.error(
`Failed to run isolate --cleanup: ${error.message} on box #${box.id}\nstdout: ${stdout}\nstderr: ${stderr}`
);
}
}
);
try {
await fs.rm(box.metadata_file_path);
} catch (e) {
this.logger.error(
`Failed to remove the metadata directory of box #${box.id}. Error: ${e.message}`
);
}
})
);
}
}

@@ -1,19 +0,0 @@
CC = gcc
CFLAGS = -O2 -Wall -lseccomp
TARGET = nosocket
BUILD_PATH = ./
INSTALL_PATH = /usr/local/bin/
SOURCE = nosocket.c
all: $(TARGET)
$(TARGET): $(SOURCE)
$(CC) $(BUILD_PATH)$(SOURCE) $(CFLAGS) -o $(TARGET)
install:
mv $(TARGET) $(INSTALL_PATH)
clean:
$(RM) $(TARGET)
$(RM) $(INSTALL_PATH)$(TARGET)

@@ -1,62 +0,0 @@
/*
nosocket.c
Disables access to the `socket` syscall and runs a program provided as the first
commandline argument.
*/
#include <stdio.h>
#include <errno.h>
#include <unistd.h>
#include <sys/prctl.h>
#include <seccomp.h>
int main(int argc, char *argv[])
{
// Disallow any new capabilities from being added
prctl(PR_SET_NO_NEW_PRIVS, 1, 0, 0, 0);
// SCMP_ACT_ALLOW lets the filter have no effect on syscalls not matching a
// configured filter rule (allow all by default)
scmp_filter_ctx ctx = seccomp_init(SCMP_ACT_ALLOW);
if (!ctx)
{
fprintf(stderr, "Unable to initialize seccomp filter context\n");
return 1;
}
// Add 32 bit and 64 bit architectures to seccomp filter
int rc;
uint32_t arch[] = {SCMP_ARCH_X86_64, SCMP_ARCH_X86, SCMP_ARCH_X32};
// We first remove the existing arch, otherwise our subsequent call to add
// it will fail
seccomp_arch_remove(ctx, seccomp_arch_native());
for (int i = 0; i < sizeof(arch) / sizeof(arch[0]); i++)
{
rc = seccomp_arch_add(ctx, arch[i]);
if (rc != 0)
{
fprintf(stderr, "Unable to add arch: %d\n", arch[i]);
return 1;
}
}
// Add a seccomp rule to the syscall blacklist - blacklist the socket syscall
if (seccomp_rule_add(ctx, SCMP_ACT_ERRNO(EACCES), SCMP_SYS(socket), 0) < 0)
{
fprintf(stderr, "Unable to add seccomp rule to context\n");
return 1;
}
#ifdef DEBUG
seccomp_export_pfc(ctx, 0);
#endif
if (argc < 2)
{
fprintf(stderr, "Usage %s: %s <program name> <arguments>\n", argv[0], argv[0]);
return 1;
}
seccomp_load(ctx);
execvp(argv[1], argv + 1);
return 1;
}

@@ -145,7 +145,11 @@ class Package {
await fs.write_file(path.join(this.install_path, '.env'), filtered_env);
logger.debug('Changing Ownership of package directory');
await util.promisify(chownr)(this.install_path, 0, 0);
await util.promisify(chownr)(
this.install_path,
process.getuid(),
process.getgid()
);
logger.debug('Writing installed state to disk');
await fs.write_file(

@@ -15,6 +15,7 @@ class Runtime {
pkgdir,
runtime,
timeouts,
cpu_times,
memory_limits,
max_process_count,
max_open_files,
@@ -27,6 +28,7 @@ class Runtime {
this.pkgdir = pkgdir;
this.runtime = runtime;
this.timeouts = timeouts;
this.cpu_times = cpu_times;
this.memory_limits = memory_limits;
this.max_process_count = max_process_count;
this.max_open_files = max_open_files;
@@ -62,6 +64,18 @@ class Runtime {
language_limit_overrides
),
},
cpu_times: {
compile: this.compute_single_limit(
language_name,
'compile_cpu_time',
language_limit_overrides
),
run: this.compute_single_limit(
language_name,
'run_cpu_time',
language_limit_overrides
),
},
memory_limits: {
compile: this.compute_single_limit(
language_name,
@@ -164,15 +178,7 @@ class Runtime {
const env_file = path.join(this.pkgdir, '.env');
const env_content = fss.read_file_sync(env_file).toString();
this._env_vars = {};
this._env_vars = env_content.trim().split('\n');
env_content
.trim()
.split('\n')
.map(line => line.split('=', 2))
.forEach(([key, val]) => {
this._env_vars[key.trim()] = val.trim();
});
}
return this._env_vars;

@@ -23,8 +23,8 @@ fetch_packages(){
mkdir build
# Start a piston container
docker run \
--privileged \
-v "$PWD/build":'/piston/packages' \
--tmpfs /piston/jobs \
-dit \
-p $port:2000 \
--name builder_piston_instance \
@@ -61,4 +61,4 @@ fetch_packages $SPEC_FILE
build_container $TAG
echo "Start your custom piston container with"
echo "$ docker run --tmpfs /piston/jobs -dit -p 2000:2000 $TAG"
echo "$ docker run --privileged -dit -p 2000:2000 $TAG"

@@ -4,8 +4,7 @@ services:
api:
build: api
container_name: piston_api
cap_add:
- CAP_SYS_ADMIN
privileged: true
restart: always
ports:
- 2000:2000
@@ -13,8 +12,6 @@ services:
- ./data/piston/packages:/piston/packages
environment:
- PISTON_REPO_URL=http://repo:8000/index
tmpfs:
- /piston/jobs:exec,uid=1000,gid=1000,mode=711
repo: # Local testing of packages
build: repo

@@ -5,10 +5,10 @@ services:
image: ghcr.io/engineer-man/piston
container_name: piston_api
restart: always
privileged: true
ports:
- 2000:2000
volumes:
- ./data/piston/packages:/piston/packages
tmpfs:
- /piston/jobs:exec,uid=1000,gid=1000,mode=711
- /tmp:exec - /tmp:exec

@@ -135,8 +135,21 @@ key:
default: 3000
```
The maximum time that is allowed to be taken by a stage in milliseconds.
The maximum time that is allowed to be taken by a stage in milliseconds. This is the wall-time of the stage. The time that the CPU does not spend working on the stage (e.g, due to context switches or IO) is counted.
Use -1 for unlimited time.
## Compile/Run CPU-Time
```yaml
key:
- PISTON_COMPILE_CPU_TIME
default: 10000
key:
- PISTON_RUN_CPU_TIME
default: 3000
```
The maximum CPU-time that is allowed to be consumed by a stage in milliseconds. The time that the CPU does not spend working on the stage (e.g, IO and context switches) is not counted. This option is typically used in algorithm contests.
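To make the wall-time vs CPU-time distinction concrete, here is a small Node.js sketch (not part of Piston, just an illustration) that measures both around an idle wait; a sleeping process accrues wall time but almost no CPU time, which is why the two limits are configured separately.

```javascript
// Illustrative sketch (not Piston code): wall time vs CPU time.
// An idle wait accumulates wall-clock time but almost no CPU time.
const start_wall = Date.now();
const start_cpu = process.cpuUsage();

// Block the thread for ~200 ms without busy-spinning.
const sab = new Int32Array(new SharedArrayBuffer(4));
Atomics.wait(sab, 0, 0, 200);

const wall_ms = Date.now() - start_wall;
const { user, system } = process.cpuUsage(start_cpu); // microseconds
const cpu_ms = (user + system) / 1000;

// wall_ms is roughly 200; cpu_ms stays far below it, so a CPU-time
// limit would not fire here while a wall-time limit eventually would.
console.log(`wall=${wall_ms}ms cpu=${cpu_ms.toFixed(1)}ms`);
```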
## Compile/Run memory limits
@@ -178,7 +191,7 @@ default: {}
```
Per-language overrides/exceptions for each of `max_process_count`, `max_open_files`, `max_file_size`,
`compile_memory_limit`, `run_memory_limit`, `compile_timeout`, `run_timeout`, `output_max_size`. Defined as follows:
`compile_memory_limit`, `run_memory_limit`, `compile_timeout`, `run_timeout`, `compile_cpu_time`, `run_cpu_time`, `output_max_size`. Defined as follows:
```
PISTON_LIMIT_OVERRIDES={"c++":{"max_process_count":128}}
```

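As a sketch of how such an override map shadows the global defaults (`defaults` and `compute_single_limit` are hypothetical names for illustration, not Piston's actual internals):

```javascript
// Hypothetical sketch: per-language limit overrides shadow global
// defaults; a language without an entry falls back to the default.
const defaults = { max_process_count: 64, run_cpu_time: 3000 };
const limit_overrides = JSON.parse('{"c++":{"max_process_count":128}}');

const compute_single_limit = (language, key) =>
    limit_overrides[language]?.[key] ?? defaults[key];

console.log(compute_single_limit('c++', 'max_process_count')); // 128
console.log(compute_single_limit('c++', 'run_cpu_time')); // 3000
console.log(compute_single_limit('python', 'max_process_count')); // 64
```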
packages/MATL/22.7.4/build.sh vendored Normal file

@@ -0,0 +1,9 @@
#!/usr/bin/env bash
# build octave as dependency
source ../../octave/6.2.0/build.sh
# curl MATL 22.7.4
curl -L "https://github.com/lmendo/MATL/archive/refs/tags/22.7.4.tar.gz" -o MATL.tar.xz
tar xf MATL.tar.xz --strip-components=1
rm MATL.tar.xz

packages/MATL/22.7.4/environment vendored Normal file

@@ -0,0 +1,5 @@
#!/usr/bin/env bash
# Path to MATL binary
export PATH=$PWD/bin:$PATH
export MATL_PATH=$PWD

packages/MATL/22.7.4/metadata.json vendored Normal file

@@ -0,0 +1,5 @@
{
"language": "matl",
"version": "22.7.4",
"aliases": []
}

packages/MATL/22.7.4/run vendored Normal file

@@ -0,0 +1,13 @@
#!/usr/bin/env bash
# get file as first argument
file="$1"
# remove the file from $@
shift
# use the rest of the arguments as stdin
stdin=`printf "%s\n" "$@"`
# pass stdin into octave which will run MATL
echo "$stdin" | octave -W -p "$MATL_PATH" --eval "matl -of '$file'"

packages/MATL/22.7.4/test.matl vendored Normal file

@@ -0,0 +1 @@
'OK'

packages/bash/5.2.0/build.sh vendored Executable file

@@ -0,0 +1,20 @@
#!/usr/bin/env bash
# Put instructions to build your package in here
PREFIX=$(realpath $(dirname $0))
mkdir -p build
cd build
curl "https://ftp.gnu.org/gnu/bash/bash-5.2.tar.gz" -o bash.tar.gz
tar xzf bash.tar.gz --strip-components=1
# === autoconf based ===
./configure --prefix "$PREFIX"
make -j$(nproc)
make install -j$(nproc)
cd ../
rm -rf build

packages/bash/5.2.0/environment vendored Normal file

@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Put 'export' statements here for environment variables
export PATH=$PWD/bin:$PATH

packages/bash/5.2.0/metadata.json vendored Normal file

@@ -0,0 +1,5 @@
{
"language": "bash",
"version": "5.2.0",
"aliases": ["sh"]
}

packages/bash/5.2.0/run vendored Normal file

@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Put instructions to run the runtime
bash "$@"

packages/bash/5.2.0/test.bash.sh vendored Normal file

@@ -0,0 +1 @@
echo "OK"

@@ -2,6 +2,6 @@
PREFIX=$(realpath $(dirname $0))
curl -L "https://github.com/crystal-lang/crystal/releases/download/0.36.1/crystal-0.36.1-1-linux-x86_64.tar.gz" -o crystal.tar.gz
curl -L "https://github.com/crystal-lang/crystal/releases/download/1.9.2/crystal-1.9.2-1-linux-x86_64.tar.gz" -o crystal.tar.gz
tar xzf crystal.tar.gz --strip-components=1
rm crystal.tar.gz

@@ -1,5 +1,5 @@
{
"language": "crystal",
"version": "0.36.1",
"version": "1.9.2",
"aliases": ["crystal", "cr"]
}

packages/dart/2.19.6/build.sh vendored Normal file

@@ -0,0 +1,11 @@
#!/usr/bin/env bash
curl -L "https://storage.googleapis.com/dart-archive/channels/stable/release/2.19.6/sdk/dartsdk-linux-x64-release.zip" -o dart.zip
unzip dart.zip
rm dart.zip
cp -r dart-sdk/* .
rm -rf dart-sdk
chmod -R +rx bin

packages/dart/2.19.6/environment vendored Normal file

@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Put 'export' statements here for environment variables
export PATH=$PWD/bin:$PATH

packages/dart/2.19.6/metadata.json vendored Normal file

@@ -0,0 +1,5 @@
{
"language": "dart",
"version": "2.19.6",
"aliases": []
}

packages/dart/2.19.6/run vendored Normal file

@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Put instructions to run the runtime
dart run "$@"

packages/dart/2.19.6/test.dart vendored Normal file

@@ -0,0 +1,3 @@
void main() {
print('OK');
}

packages/dart/3.0.1/build.sh vendored Executable file

@@ -0,0 +1,11 @@
#!/usr/bin/env bash
curl -L "https://storage.googleapis.com/dart-archive/channels/stable/release/3.0.1/sdk/dartsdk-linux-x64-release.zip" -o dart.zip
unzip dart.zip
rm dart.zip
cp -r dart-sdk/* .
rm -rf dart-sdk
chmod -R +rx bin

packages/dart/3.0.1/environment vendored Normal file

@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Put 'export' statements here for environment variables
export PATH=$PWD/bin:$PATH

packages/dart/3.0.1/metadata.json vendored Normal file

@@ -0,0 +1,5 @@
{
"language": "dart",
"version": "3.0.1",
"aliases": []
}

packages/dart/3.0.1/run vendored Normal file

@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Put instructions to run the runtime
dart run "$@"

packages/dart/3.0.1/test.dart vendored Normal file

@@ -0,0 +1,3 @@
void main() {
print('OK');
}

@@ -1,3 +1,3 @@
#!/bin/bash
DENO_DIR=$PWD deno run $@
DENO_DIR=$PWD NO_COLOR=true deno run $@

packages/deno/1.32.3/build.sh vendored Normal file

@@ -0,0 +1,5 @@
#!/bin/bash
curl -OL https://github.com/denoland/deno/releases/download/v1.32.3/deno-x86_64-unknown-linux-gnu.zip
unzip -o deno-x86_64-unknown-linux-gnu.zip
rm deno-x86_64-unknown-linux-gnu.zip

packages/deno/1.32.3/environment vendored Normal file

@@ -0,0 +1,3 @@
#!/bin/bash
export PATH=$PWD:$PATH

packages/deno/1.32.3/metadata.json vendored Normal file

@@ -0,0 +1,20 @@
{
"language": "deno",
"version": "1.32.3",
"dependencies": {},
"provides": [
{
"language": "typescript",
"aliases": [
"deno",
"deno-ts"
]
},
{
"language": "javascript",
"aliases": [
"deno-js"
]
}
]
}

packages/deno/1.32.3/run vendored Normal file

@@ -0,0 +1,3 @@
#!/bin/bash
DENO_DIR=$PWD NO_COLOR=true deno run $@

1
packages/deno/1.32.3/test.deno.ts vendored Normal file
@@ -0,0 +1 @@
console.log("OK")

@@ -1,2 +1,2 @@
 #!/bin/bash
-DENO_DIR=$PWD deno run "$@"
+DENO_DIR=$PWD NO_COLOR=true deno run "$@"

5
packages/freebasic/1.9.0/build.sh vendored Normal file
@@ -0,0 +1,5 @@
#!/usr/bin/env bash
curl -L "https://sourceforge.net/projects/fbc/files/FreeBASIC-1.09.0/Binaries-Linux/FreeBASIC-1.09.0-linux-x86_64.tar.gz/download" -o freebasic.tar.gz
tar xf freebasic.tar.gz --strip-components=1
rm freebasic.tar.gz

4
packages/freebasic/1.9.0/compile vendored Normal file
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Compile bas files
fbc -lang qb -b "$@" -x out

4
packages/freebasic/1.9.0/environment vendored Normal file
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Path to fbc compiler
export PATH=$PWD/bin:$PATH

@@ -0,0 +1,5 @@
{
"language": "freebasic",
"version": "1.9.0",
"aliases": ["bas", "fbc", "basic", "qbasic", "quickbasic"]
}

5
packages/freebasic/1.9.0/run vendored Normal file
@@ -0,0 +1,5 @@
#!/usr/bin/env bash
# Run output file from compile with arguments
shift
./out "$@"
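
Several of these run scripts (freebasic, kotlin, nim) share the same argument convention: the runner passes the source file name first, followed by the program's own arguments, so the script drops the file name with `shift` before handing the rest to the compiled binary. A minimal, hypothetical sketch of that pattern, with a `demo` function standing in for `./out`:

```shell
#!/usr/bin/env bash
# Stand-in for a compiled ./out binary: print each argument it receives.
demo() { printf '%s\n' "$@"; }

set -- test.bas first second   # what the runner passes: file name, then args
shift                          # the file name was only needed at compile time
demo "$@"                      # the program sees only: first, second
```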

1
packages/freebasic/1.9.0/test.bas vendored Normal file
@@ -0,0 +1 @@
PRINT "OK"

21
packages/julia/1.8.5/build.sh vendored Normal file
@@ -0,0 +1,21 @@
#!/usr/bin/env bash
# Install location
PREFIX=$(realpath $(dirname $0))
mkdir -p build
cd build
# Download and extract Julia source
curl -L "https://github.com/JuliaLang/julia/releases/download/v1.8.5/julia-1.8.5.tar.gz" -o julia.tar.gz
tar xzf julia.tar.gz --strip-components=1
# Build
echo "JULIA_CPU_TARGET=generic;sandybridge,-xsaveopt,clone_all;haswell,-rdrnd,base(1)
prefix=$PREFIX" > Make.user
make -j$(nproc)
make install -j$(nproc)
# Cleanup
cd ..
rm -rf build

4
packages/julia/1.8.5/environment vendored Normal file
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Add Julia binary to path
export PATH=$PWD/bin:$PATH

5
packages/julia/1.8.5/metadata.json vendored Normal file
@@ -0,0 +1,5 @@
{
"language": "julia",
"version": "1.8.5",
"aliases": ["jl"]
}

4
packages/julia/1.8.5/run vendored Normal file
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Run without startup or history file
julia --startup-file=no --history-file=no "$@"

1
packages/julia/1.8.5/test.jl vendored Normal file
@@ -0,0 +1 @@
println("OK")

9
packages/k/1.0.0/build.sh vendored Normal file
@@ -0,0 +1,9 @@
#!/usr/bin/env bash
set -e
git clone "https://codeberg.org/ngn/k" k
cd k
git checkout 544d014afd8dd84b18c2011cabd3aa3d76571ca3
make CC=gcc

5
packages/k/1.0.0/environment vendored Normal file
@@ -0,0 +1,5 @@
#!/usr/bin/env bash
# k path
export PATH=$PWD/bin:$PATH
export K_PATH=$PWD/k

5
packages/k/1.0.0/metadata.json vendored Normal file
@@ -0,0 +1,5 @@
{
"language": "k",
"version": "1.0.0",
"aliases": ["ngnk"]
}

3
packages/k/1.0.0/run vendored Normal file
@@ -0,0 +1,3 @@
#!/usr/bin/env bash
$K_PATH/k "$@"

1
packages/k/1.0.0/test.k vendored Normal file
@@ -0,0 +1 @@
`0:`c$2/'((1 0 0 1 1 1 1);(1 0 0 1 0 1 1))
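
Unlike the other packages' `print("OK")` one-liners, the k test builds its output from bit vectors: `2/'` folds each base-2 digit list into an integer (1001111 → 79, 1001011 → 75), `` `c$ `` casts those to characters, and `` `0: `` writes the result to stdout. The same decoding, shown as an illustration in plain bash (not part of the package):

```shell
#!/usr/bin/env bash
# Decode the two bit vectors from test.k: base-2 digits -> byte -> character.
for bits in 1001111 1001011; do
  printf "$(printf '\\%03o' "$((2#$bits))")"   # 79 -> O, 75 -> K
done
echo   # -> OK
```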

13
packages/kotlin/1.8.20/build.sh vendored Normal file
@@ -0,0 +1,13 @@
#!/usr/bin/env bash
# Download and extract JDK8
curl -L "https://github.com/AdoptOpenJDK/openjdk8-binaries/releases/download/jdk8u292-b10/OpenJDK8U-jdk_x64_linux_hotspot_8u292b10.tar.gz" -o jdk.tar.gz
tar xzf jdk.tar.gz --strip-components=1
rm jdk.tar.gz
# Download and extract Kotlin
curl -L "https://github.com/JetBrains/kotlin/releases/download/v1.8.20/kotlin-compiler-1.8.20.zip" -o kotlin.zip
unzip kotlin.zip
rm kotlin.zip
cp -r kotlinc/* .
rm -rf kotlinc

6
packages/kotlin/1.8.20/compile vendored Normal file
@@ -0,0 +1,6 @@
#!/usr/bin/env bash
rename 's/$/\.kt/' "$@" # Add .kt extension
# Compile Kotlin code to a jar file
kotlinc *.kt -include-runtime -d code.jar
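
kotlinc only accepts sources with a `.kt` extension, which is why the compile script renames the submitted files before invoking it. Where the perl `rename` utility is unavailable, the same step can be sketched as a plain loop (a hypothetical equivalent, not part of the package):

```shell
#!/usr/bin/env bash
# Append .kt to every submitted file so kotlinc will accept it.
for f in "$@"; do
  mv -- "$f" "$f.kt"
done
```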

4
packages/kotlin/1.8.20/environment vendored Normal file
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Add java and kotlinc to path
export PATH=$PWD/bin:$PATH

5
packages/kotlin/1.8.20/metadata.json vendored Normal file
@@ -0,0 +1,5 @@
{
"language": "kotlin",
"version": "1.8.20",
"aliases": ["kt"]
}

5
packages/kotlin/1.8.20/run vendored Normal file
@@ -0,0 +1,5 @@
#!/usr/bin/env bash
# Run jar file
shift
java -jar code.jar "$@"

3
packages/kotlin/1.8.20/test.kt vendored Normal file
@@ -0,0 +1,3 @@
fun main() {
println("OK")
}

14
packages/lua/5.4.4/build.sh vendored Normal file
@@ -0,0 +1,14 @@
#!/usr/bin/env bash
# Put instructions to build your package in here
curl -R -O -L http://www.lua.org/ftp/lua-5.4.4.tar.gz
tar zxf lua-5.4.4.tar.gz
rm lua-5.4.4.tar.gz
cd lua-5.4.4
# Building Lua
make linux
# To check that Lua has been built correctly
make test
# Installing Lua
make linux install

4
packages/lua/5.4.4/environment vendored Normal file
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Put 'export' statements here for environment variables
export PATH="$PWD/lua-5.4.4/src:$PATH"

5
packages/lua/5.4.4/metadata.json vendored Normal file
@@ -0,0 +1,5 @@
{
"language": "lua",
"version": "5.4.4",
"aliases": []
}

4
packages/lua/5.4.4/run vendored Normal file
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Put instructions to run the runtime
lua "$@"

1
packages/lua/5.4.4/test.lua vendored Normal file
@@ -0,0 +1 @@
print("OK")

@@ -5,7 +5,7 @@ PREFIX=$(realpath $(dirname $0))
 mkdir -p build/mono build/mono-basic
 cd build
-curl "https://download.mono-project.com/sources/mono/mono-6.12.0.122.tar.xz" -o mono.tar.xz
+curl "https://download.mono-project.com/sources/mono/mono-6.12.0.182.tar.xz" -o mono.tar.xz
 curl -L "https://github.com/mono/mono-basic/archive/refs/tags/4.7.tar.gz" -o mono-basic.tar.gz
 tar xf mono.tar.xz --strip-components=1 -C mono
 tar xf mono-basic.tar.gz --strip-components=1 -C mono-basic

18
packages/nim/1.6.2/build.sh vendored Normal file
@@ -0,0 +1,18 @@
#!/bin/bash
PREFIX=$(realpath $(dirname $0))
mkdir -p build
cd build
# Prebuilt binary - source *can* be built, but it requires gcc
curl -L "https://nim-lang.org/download/nim-1.6.2-linux_x64.tar.xz" -o nim.tar.xz
tar xf nim.tar.xz --strip-components=1
rm nim.tar.xz
./install.sh "$PREFIX"
cd ../
rm -rf build

5
packages/nim/1.6.2/compile vendored Normal file
@@ -0,0 +1,5 @@
#!/usr/bin/env bash
# Compile nim file(s)
nim --hints:off --out:out --nimcache:./ c "$@"
chmod +x out

1
packages/nim/1.6.2/environment vendored Normal file
@@ -0,0 +1 @@
export PATH=$PWD/nim/bin:$PATH

5
packages/nim/1.6.2/metadata.json vendored Normal file
@@ -0,0 +1,5 @@
{
"language": "nim",
"version": "1.6.2",
"aliases": []
}

4
packages/nim/1.6.2/run vendored Normal file
@@ -0,0 +1,4 @@
#!/bin/bash
shift # Filename is only used to compile
./out "$@"

1
packages/nim/1.6.2/test.nim vendored Normal file
@@ -0,0 +1 @@
echo("OK")

4
packages/node/18.15.0/build.sh vendored Normal file
@@ -0,0 +1,4 @@
#!/bin/bash
curl "https://nodejs.org/dist/v18.15.0/node-v18.15.0-linux-x64.tar.xz" -o node.tar.xz
tar xf node.tar.xz --strip-components=1
rm node.tar.xz

1
packages/node/18.15.0/environment vendored Normal file
@@ -0,0 +1 @@
export PATH=$PWD/bin:$PATH

10
packages/node/18.15.0/metadata.json vendored Normal file
@@ -0,0 +1,10 @@
{
"language": "node",
"version": "18.15.0",
"provides": [
{
"language": "javascript",
"aliases": ["node-javascript", "node-js", "javascript", "js"]
}
]
}

3
packages/node/18.15.0/run vendored Normal file
@@ -0,0 +1,3 @@
#!/bin/bash
node "$@"

1
packages/node/18.15.0/test.js vendored Normal file
@@ -0,0 +1 @@
console.log('OK');

4
packages/node/20.11.1/build.sh vendored Normal file
@@ -0,0 +1,4 @@
#!/bin/bash
curl "https://nodejs.org/dist/v20.11.1/node-v20.11.1-linux-x64.tar.xz" -o node.tar.xz
tar xf node.tar.xz --strip-components=1
rm node.tar.xz

1
packages/node/20.11.1/environment vendored Normal file
@@ -0,0 +1 @@
export PATH=$PWD/bin:$PATH

10
packages/node/20.11.1/metadata.json vendored Normal file
@@ -0,0 +1,10 @@
{
"language": "node",
"version": "20.11.1",
"provides": [
{
"language": "javascript",
"aliases": ["node-javascript", "node-js", "javascript", "js"]
}
]
}

3
packages/node/20.11.1/run vendored Normal file
@@ -0,0 +1,3 @@
#!/bin/bash
node "$@"

1
packages/node/20.11.1/test.js vendored Normal file
@@ -0,0 +1 @@
console.log('OK');

22
packages/octave/8.1.0/build.sh vendored Normal file
@@ -0,0 +1,22 @@
#!/usr/bin/env bash
# Build octave from source
PREFIX=$(realpath $(dirname $0))
mkdir -p build
cd build
curl -L "https://ftpmirror.gnu.org/octave/octave-8.1.0.tar.gz" -o octave.tar.gz
tar xzf octave.tar.gz --strip-components=1
# === autoconf based ===
# Disable support for GUI, HDF5 and Java
./configure --prefix "$PREFIX" --without-opengl --without-qt --without-x --without-hdf5 --disable-java
make -j$(nproc)
make install -j$(nproc)
cd ../
rm -rf build

4
packages/octave/8.1.0/environment vendored Normal file
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# Path to octave binary
export PATH=$PWD/bin:$PATH

Some files were not shown because too many files have changed in this diff.