SeBS Cloudflare Compatibility #274
Open — userlaurin wants to merge 175 commits into spcl:master from userlaurin:master
Changes from 69 commits
175 commits:
9a9d42e
initial sebs cloudflare infra, functions, config, triggers. readme in…
aa24a07
systems.json cloudflare config
MisterMM23 4cc0476
highly incomplete work on benchmark wrappers, using R2 and KV.
cd24fcf
wrappers - changes to handler and storage - can now run benchmark 110…
eaa42a1
just some changes. storage still not properly tested...
57452fa
concept for r2 storage
MisterMM23 9e47e0f
translated wrapper to js
822a9d9
used output from workers as analytics measurements in sebs
f7bb950
last changes necessary for sebs to run cloudflare. now just the stora…
1f0a979
javascript wrapper with polyfills reading from r2. r2 implementation,…
b117e75
adapted handler to measure invocation time
d42b157
fixed the fs polyfill to also support write operations to r2 storage…
ffd3f78
added compatibility for benchmarks 100 in nodejs. translated all 100 …
272a372
current situation where asyncio cannot run the async function
556d799
dynamically add async to benchmark function *shrug*
93c8a73
nosql updates
e17982f
idea for circumvention of asyncio
MisterMM23 214c947
wrappers - run_sync for storage.py
b8f7c5c
nosql wrapper uses run_sync
dba2992
cloudflare nodejs wrapper without r2 as fs polyfill, just node_compat…
5390021
cleanup nodejs deployment cloudflare, no uploading files necessary an…
24497a2
add folder structure to python code package
8812708
nosql wrapper - durable object - may work
5b3d784
fix python. 110 runs for me.
cd183b8
make it read the requirements.txt when it has a number
9379f39
durable objects compatibility for nodejs
5f9ad9c
asyncified the function calls...
92db5ae
fix python vendored modules
51892b0
added request polyfill for benchmark 120
3235d3f
fixed r2 usage for 120, 311
416b67b
support for cloudflare containers (python and nodejs), container work…
5284880
bigger container for python containers
b6de39b
sleep delay longer
812f592
request_id has to be string
9229f9f
update container fixed
5899d87
fixed benchmark wrapper request ids for experiment results as well as…
6e0cd2b
extract memory correctly
3cd741f
pyodide does not support resource module for memory measurement
2615a36
timing fix for cloudflare handler
e69243a
fixed python timing issue
f39aad0
removed faulty nodejs implementations of 000 bmks
e76f846
removed unnecessary logging
dc2f6ed
removed experiments.json and package*.json
437cc97
updated cloudflare readme to reflect final changes
1eb375c
has platform check according to convention, durable object items remo…
6c0768e
updated readme to document the correct return object structure by the…
0dfcfa8
documented cold start tracking limitation
b2465f9
removed unreachable return statement in cloudflare.py
0eb4d0b
small fix to use public property
db84f2d
small fix for public field in durable objects
7e2d8ac
converted nosql client calls to async and removed the corresponding p…
35a556d
Fix instance variable naming in nosql_do class
ldzgch 92c5dea
Rename class instance reference from nosql to nosql_kv
ldzgch bcd5ecb
Apply suggestions from code review - storage.py
ldzgch 96ac2c1
Apply suggestions from code review
ldzgch b028151
config placeholder for api tokens, r2 etc
03e274e
variable base image in docker file... have to replace the right image…
a11236a
copy and execute init.sh file from benchmark directory, and execute i…
35755d6
add to existing markdown for cloudflare specific documentation instea…
b427c5b
more detail about download_metrics()
734eadf
Docker build container for build orchestration locally
5ffcb06
using docker client to build local image
4da0c31
do not create library trigger
4874794
removed some deprecated logging, throw exception if cold start is used
6b8e695
split the cloudflare.py into containers.py + workers.py... each handl…
865ca06
use toml writer, cleaner approach to write wrangler.toml
20eb8db
accidental capitalization of the table name resulting in errors in 130
ebe0794
warning that locationHint is not supported
9dd0a6e
s3 client for r2 storage code duplication removed. and also removed l…
b4f08a1
feat(cloudflare): Introduce KVStore for NoSQL storage and remove Dura…
fa7b2f5
refactor(nosql_kv): simplify index check by removing redundant condit…
1311f20
feat(cloudflare): Enhance HTTP invocation with User-Agent header and …
02cb35a
feat(cloudflare): Add support for benchmark validation based on langu…
60aa631
fix(cloudflare): Update deployment logic for Python workers and ensur…
cf9d333
refactor(cloudflare): abstract nodejs worker build into Dockerfile.wo…
cf5a547
fix(cloudflare): Implement health check pings to keep container warm …
91bb9a1
feat(cloudflare): Enhance language configuration with variant support…
8ea37c5
feat(cloudflare): Implement multipart upload support for R2 storage i…
a60e5d4
feat(cloudflare): Enable parallel downloads in download_directory met…
18070f0
fix(regression): Update benchmark filtering to use test_benchmark for…
044b9ef
fix(cloudflare): Enhance error handling for container provisioning in…
7ac2b8c
fix(regression): Update benchmark filtering logic to correctly extrac…
78b2979
fix(config): Correct cloudflare worker variant to use default configu…
6d7e4d0
fix(cloudflare): Remove hardcoded language-variant from Cloudflare te…
2cc8f93
fix(cloudflare): Improve handling of Cloudflare error code 1042 for w…
3f8e69c
fix(cloudflare): Enhance content type inference and improve upload ke…
c8ca384
fix(cloudflare): Update storage documentation to clarify container up…
b71e2c8
fix(cli): Add deployment type option to regression command for better…
53a1cd2
fix(cloudflare): Pin workers-py to version 1.8.0 to avoid broken impo…
32debbf
fix(cloudflare): Correct typo in typename and implement download meth…
e22bb62
fix(docs): Update comments to clarify Cloudflare Workers differences …
acf2e33
fix(cloudflare): Improve request handling and error responses in cont…
e4b2abf
fix(cloudflare): Enhance debugging by utilizing Node.js debuglog for …
28d90d7
fix(nosql): Update documentation and streamline query method by remov…
7517eaa
refactor(docker): Moved Dockerfiles for Node.js and Python functions
6eae47c
feat(docker): Add Dockerfile.build for Python worker validation and u…
6a3a8db
docs(storage): Clarify proxy usage for Cloudflare R2 and explain stor…
95fdaba
fix(storage): Enhance debugging by replacing console logs with debugl…
6b9434e
docs(cloudflare): Add deployment architecture and detailed flow for s…
8966909
refactor(handler): Move all imports to the top.
3665df8
refactor(storage): Remove content type inference from upload function…
2dafdf9
docs(build): Enhance documentation for build process and clarify Work…
eb21ce5
feat(handler): Implement advanceWorkersClock function to manage timin…
0be8706
docs(nosql): Add clarification on resource access in Cloudflare Workers
5908cef
docs(nosql): Update module description to clarify HTTP POST operation…
2040294
refactor(storage): Simplify list URL request by removing unnecessary …
6d8a1c6
refactor(handler): Remove unused functions and clean up code in handl…
6a5ef6b
docs(worker): Update comments to clarify the purpose and usage of wor…
9ed8f9c
docs(benchmark): Enhance comments to clarify handling of worker.js an…
26ea601
refactor(cli): Update deployment type options in regression command
f14bce9
feat(dependencies): Add patch-ng and tomli dependencies for compatibi…
5a1fdcc
feat(cloudflare): changes in workers.py including moving the template…
bcd3dae
refactor(regression): streamline Cloudflare benchmark configurations …
4f7a279
refactor(nosql): simplify constructor by removing unnecessary comment
8acec87
fix(docs): update Wrangler template paths to reflect new directory st…
812fe72
refactor(cli): streamline Dockerfile path resolution using get_resour…
d8f08db
refactor(containers): simplify Dockerfile path resolution using get_r…
97bb674
docs(cloudflare): enhance authentication section with detailed API to…
e69c836
refactor(containers): update Cloudflare CLI initialization and improv…
42eef0e
docs(kvstore): enhance KVStore class documentation with detailed name…
4a7f036
refactor(r2): improve error handling and logging in list_bucket method
cd1def7
refactor(containers): update Cloudflare CLI initialization to use con…
9b9894a
feat(cloudflare): enhance variant selection logic for Cloudflare depl…
6d94293
refactor(cloudflare): restructure Dockerfile.build for Node.js and Py…
31624ec
refactor(cloudflare): update base images for Python and Node.js to us…
0253844
refactor(cloudflare): enhance Dockerfile.build for Node.js and Python…
b84c330
refactor(cloudflare): simplify Docker image handling and improve sing…
b7ee302
refactor(cloudflare): update Docker socket mount comment for clarity …
8bc3595
refactor(cloudflare): enhance container image handling and add worker…
4574d8c
refactor(containers): get singleton cli instance, get template files …
220574c
refactor(config): update cloudflare variant structure and add require…
4a3fc61
refactor(cloudflare): enhance container image deployment by building …
33441b2
refactor(containers): streamline local container image build process …
ac51c3e
refactor(credentials): enhance documentation for Cloudflare API crede…
2ce61f3
refactor(r2): improve error handling in list_bucket method for S3 cli…
184695c
refactor(workers): update Cloudflare CLI initialization to use get_in…
38cd1c0
refactor(platforms): update build image descriptions and enhance buil…
360b2a2
refactor(cli): enhance Docker image handling and add containers_push …
51f1c3b
refactor(cloudflare): adjust wrangler.toml generation order to utiliz…
76d5d17
refactor(cloudflare): simplify worker URL generation and update loggi…
4af9a0f
fix(cloudflare): raise error for missing workers.dev subdomain instea…
d6baba3
fix(cli): ensure @cloudflare/container npm dependency is installed be…
cec0f9d
fix(containers): update image tag generation to use timestamp for ver…
89270cf
refactor(cloudflare): enhance container deployment flow with local im…
cfb935c
merged new commits into fork
ff45c7a
Merge branch 'spcl-master'
fad8b40
refactor: formatted using black
fbe8185
fix(handler): update return structure to include result object
bf1b0ac
fix(handler): update log_data structure to use 'result' key instead o…
584a3f8
fix(build): extend node built-ins filter to include 'constants' and h…
bd2111d
fix(workers): streamline package code logic and ensure proper handlin…
c476ac8
fix(storage): simplify data handling in aupload_stream by removing ba…
b9eb11f
fix(benchmarks): update subproject commit reference to latest version
e34c67d
fix(cli): add 'cloudflare' option to deployment platforms in CLI comm…
cd5992b
fix(cloudflare): enhance CLI container management for thread safety a…
6500fd2
fix(cloudflare): add manage image to systems.json
16e7454
fix(docs): clarify terminology for Cloudflare Workers in platforms.md
d3e6568
fix(docs): add wall-clock timing explanation for Cloudflare Workers i…
9681b6e
refactor(cloudflare): remove container warm-up logic from Cloudflare …
4d2db1f
refactor: linting with flake8
eda4d0f
refactor(cloudflare): update type hints and improve error handling in…
305c4fb
feat(storage): add downloadDirectory method to facilitate directory d…
1e402e4
black reformat
9340696
refactor(cloudflare): enhance function cache handling and redeploymen…
867f6b7
refactor(cloudflare): streamline logging message for worker redeployment
c076d79
refactor(mypy): add missing imports configuration for docker module
7c8867b
refactor(cloudflare): update docker client type hints for consistency
617e7d2
refactor(triggers): increase provisioning retries and wait time for H…
88e232b
refactor(cli): update Docker image tag handling to include versioning
606e1ac
black
c123da6
refactor(cli): add assertion for output type in command execution
c9c5372
refactor(cloudflare): enhance docstrings for clarity and consistency …
2053171
refactor(health-check): enhance comments for clarity on health check …
50ca4d6
Merge branch 'master' of upstream — integrate GCP gen2 support
a083ef3
refactor(cloudflare): replace container_deployment with system_varian…
```diff
 {
   "timeout": 120,
   "memory": 128,
-  "languages": ["python", "nodejs"],
+  "languages": ["python"],
   "modules": []
 }
```
benchmarks/100.webapps/120.uploader/python/function_cloudflare.py (56 additions)

```python
import datetime
import os

from pyodide.ffi import run_sync
from pyodide.http import pyfetch

from . import storage
client = storage.storage.get_instance()

SEBS_USER_AGENT = "SeBS/1.2 (https://github.com/spcl/serverless-benchmarks) SeBS Benchmark Suite/1.2"


async def do_request(url, download_path):
    headers = {'User-Agent': SEBS_USER_AGENT}

    res = await pyfetch(url, headers=headers)
    bs = await res.bytes()

    with open(download_path, 'wb') as f:
        f.write(bs)


def handler(event):

    bucket = event.get('bucket').get('bucket')
    output_prefix = event.get('bucket').get('output')
    url = event.get('object').get('url')
    name = os.path.basename(url)
    download_path = '/tmp/{}'.format(name)

    process_begin = datetime.datetime.now()

    run_sync(do_request(url, download_path))

    size = os.path.getsize(download_path)
    process_end = datetime.datetime.now()

    upload_begin = datetime.datetime.now()
    key_name = client.upload(bucket, os.path.join(output_prefix, name), download_path)
    upload_end = datetime.datetime.now()

    process_time = (process_end - process_begin) / datetime.timedelta(microseconds=1)
    upload_time = (upload_end - upload_begin) / datetime.timedelta(microseconds=1)
    return {
        'result': {
            'bucket': bucket,
            'url': url,
            'key': key_name
        },
        'measurement': {
            'download_time': 0,
            'download_size': 0,
            'upload_time': upload_time,
            'upload_size': size,
            'compute_time': process_time
        }
    }
```
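The handler above reports times in microseconds by dividing a `timedelta` by `datetime.timedelta(microseconds=1)`. A minimal, self-contained sketch of that timing convention — the `time.sleep` call is an illustrative stand-in for the benchmark's actual download and compute work:

```python
import datetime
import time

# Bracket a section of work the same way the handler does.
process_begin = datetime.datetime.now()
time.sleep(0.01)  # stand-in for the actual work being timed
process_end = datetime.datetime.now()

# Dividing one timedelta by another yields a float, so this converts the
# elapsed wall-clock time into microseconds (10 ms -> roughly 10000.0).
process_time = (process_end - process_begin) / datetime.timedelta(microseconds=1)
print(process_time)
```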
```javascript
const nosql = require('./nosql');

const nosqlClient = nosql.nosql.get_instance();
const nosqlTableName = "shopping_cart";

async function addProduct(cartId, productId, productName, price, quantity) {
    await nosqlClient.insert(
        nosqlTableName,
        ["cart_id", cartId],
        ["product_id", productId],
        { price: price, quantity: quantity, name: productName }
    );
}

async function getProducts(cartId, productId) {
    return await nosqlClient.get(
        nosqlTableName,
        ["cart_id", cartId],
        ["product_id", productId]
    );
}

async function queryProducts(cartId) {
    const res = await nosqlClient.query(
        nosqlTableName,
        ["cart_id", cartId],
        "product_id"
    );

    const products = [];
    let priceSum = 0;
    let quantitySum = 0;

    for (const product of res) {
        products.push(product.name);
        priceSum += product.price;
        quantitySum += product.quantity;
    }

    const avgPrice = quantitySum > 0 ? priceSum / quantitySum : 0.0;

    return {
        products: products,
        total_cost: priceSum,
        avg_price: avgPrice
    };
}

exports.handler = async function(event) {
    const results = [];

    for (const request of event.requests) {
        const route = request.route;
        const body = request.body;
        let res;

        if (route === "PUT /cart") {
            await addProduct(
                body.cart,
                body.product_id,
                body.name,
                body.price,
                body.quantity
            );
            res = {};
        } else if (route === "GET /cart/{id}") {
            res = await getProducts(body.cart, request.path.id);
        } else if (route === "GET /cart") {
            res = await queryProducts(body.cart);
        } else {
            throw new Error(`Unknown request route: ${route}`);
        }

        results.push(res);
    }

    return { result: results };
};
```
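`queryProducts` above reduces the rows returned for a cart into a product list, a price total, and an average price (price sum over quantity sum). The same reduction as a standalone Python sketch — the sample rows are illustrative, not benchmark data:

```python
def summarize_cart(rows):
    """Aggregate cart rows the way queryProducts does."""
    products, price_sum, quantity_sum = [], 0, 0
    for row in rows:
        products.append(row["name"])
        price_sum += row["price"]
        quantity_sum += row["quantity"]
    # Mirror the JS guard against an empty/zero-quantity cart.
    avg_price = price_sum / quantity_sum if quantity_sum > 0 else 0.0
    return {"products": products, "total_cost": price_sum, "avg_price": avg_price}

rows = [
    {"name": "pen", "price": 2, "quantity": 4},
    {"name": "pad", "price": 6, "quantity": 2},
]
print(summarize_cart(rows))
```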
```json
{
  "name": "crud-api",
  "version": "1.0.0",
  "description": "CRUD API benchmark",
  "author": "",
  "license": "",
  "dependencies": {
  }
}
```
benchmarks/300.utilities/311.compression/nodejs/function.js (147 additions)

```javascript
const fs = require('fs');
const path = require('path');
const zlib = require('zlib');
const { v4: uuidv4 } = require('uuid');
const storage = require('./storage');

let storage_handler = new storage.storage();

/**
 * Calculate total size of a directory recursively
 * @param {string} directory - Path to directory
 * @returns {number} Total size in bytes
 */
function parseDirectory(directory) {
    let size = 0;

    function walkDir(dir) {
        const files = fs.readdirSync(dir);
        for (const file of files) {
            const filepath = path.join(dir, file);
            const stat = fs.statSync(filepath);
            if (stat.isDirectory()) {
                walkDir(filepath);
            } else {
                size += stat.size;
            }
        }
    }

    walkDir(directory);
    return size;
}

/**
 * Create a simple tar.gz archive from a directory using native zlib
 * This creates a gzip-compressed tar archive without external dependencies
 * @param {string} sourceDir - Directory to compress
 * @param {string} outputPath - Path for the output archive file
 * @returns {Promise<void>}
 */
async function createTarGzArchive(sourceDir, outputPath) {
    // Create a simple tar-like format (concatenated files with headers)
    const files = [];

    function collectFiles(dir, baseDir = '') {
        const entries = fs.readdirSync(dir);
        for (const entry of entries) {
            const fullPath = path.join(dir, entry);
            const relativePath = path.join(baseDir, entry);
            const stat = fs.statSync(fullPath);

            if (stat.isDirectory()) {
                collectFiles(fullPath, relativePath);
            } else {
                files.push({
                    path: relativePath,
                    fullPath: fullPath,
                    size: stat.size
                });
            }
        }
    }

    collectFiles(sourceDir);

    // Create a concatenated buffer of all files with simple headers
    const chunks = [];
    for (const file of files) {
        const content = fs.readFileSync(file.fullPath);
        // Simple header: filename length (4 bytes) + filename + content length (4 bytes) + content
        const pathBuffer = Buffer.from(file.path);
        const pathLengthBuffer = Buffer.allocUnsafe(4);
        pathLengthBuffer.writeUInt32BE(pathBuffer.length, 0);
        const contentLengthBuffer = Buffer.allocUnsafe(4);
        contentLengthBuffer.writeUInt32BE(content.length, 0);

        chunks.push(pathLengthBuffer);
        chunks.push(pathBuffer);
        chunks.push(contentLengthBuffer);
        chunks.push(content);
    }

    const combined = Buffer.concat(chunks);

    // Compress using gzip
    const compressed = zlib.gzipSync(combined, { level: 9 });
    fs.writeFileSync(outputPath, compressed);
}

exports.handler = async function(event) {
    const bucket = event.bucket.bucket;
    const input_prefix = event.bucket.input;
    const output_prefix = event.bucket.output;
    const key = event.object.key;

    // Create unique download path
    const download_path = path.join('/tmp', `${key}-${uuidv4()}`);
    fs.mkdirSync(download_path, { recursive: true });

    // Download directory from storage
    const s3_download_begin = Date.now();
    await storage_handler.download_directory(bucket, path.join(input_prefix, key), download_path);
    const s3_download_stop = Date.now();

    // Calculate size of downloaded files
    const size = parseDirectory(download_path);

    // Compress directory
    const compress_begin = Date.now();
    const archive_name = `${key}.tar.gz`;
    const archive_path = path.join(download_path, archive_name);
    await createTarGzArchive(download_path, archive_path);
    const compress_end = Date.now();

    // Get archive size
    const archive_size = fs.statSync(archive_path).size;

    // Upload compressed archive
    const s3_upload_begin = Date.now();
    const [key_name, uploadPromise] = storage_handler.upload(
        bucket,
        path.join(output_prefix, archive_name),
        archive_path
    );
    await uploadPromise;
    const s3_upload_stop = Date.now();

    // Calculate times in microseconds
    const download_time = (s3_download_stop - s3_download_begin) * 1000;
    const upload_time = (s3_upload_stop - s3_upload_begin) * 1000;
    const process_time = (compress_end - compress_begin) * 1000;

    return {
        result: {
            bucket: bucket,
            key: key_name
        },
        measurement: {
            download_time: download_time,
            download_size: size,
            upload_time: upload_time,
            upload_size: archive_size,
            compute_time: process_time
        }
    };
};
```
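Despite the `.tar.gz` name, the archive produced by `createTarGzArchive` is not a real tar: each file is framed as a 4-byte big-endian path length, the path bytes, a 4-byte big-endian content length, and the content bytes, with the concatenation gzip-compressed. A Python sketch of that framing and its inverse — the function names and sample entries are illustrative, not part of the benchmark:

```python
import gzip
import struct

def pack_entries(entries):
    """Frame (path, content) pairs the way createTarGzArchive does, then gzip."""
    chunks = []
    for file_path, content in entries:
        p = file_path.encode()
        chunks.append(struct.pack(">I", len(p)))        # 4-byte BE path length
        chunks.append(p)                                # path bytes
        chunks.append(struct.pack(">I", len(content)))  # 4-byte BE content length
        chunks.append(content)                          # content bytes
    return gzip.compress(b"".join(chunks), compresslevel=9)

def unpack_entries(blob):
    """Invert the framing: gzip-decompress, then walk the length-prefixed records."""
    data = gzip.decompress(blob)
    entries, offset = [], 0
    while offset < len(data):
        (plen,) = struct.unpack_from(">I", data, offset)
        offset += 4
        file_path = data[offset:offset + plen].decode()
        offset += plen
        (clen,) = struct.unpack_from(">I", data, offset)
        offset += 4
        entries.append((file_path, data[offset:offset + clen]))
        offset += clen
    return entries

archive = pack_entries([("a.txt", b"hello"), ("dir/b.bin", b"\x00\x01")])
print(unpack_entries(archive))
```

Because the format carries no tar headers, a standard `tar` tool cannot read the output; only a reader like the one above can.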
```json
{
  "name": "compression-benchmark",
  "version": "1.0.0",
  "description": "Compression benchmark for serverless platforms",
  "main": "function.js",
  "dependencies": {
    "uuid": "^10.0.0"
  }
}
```