
api server: deploy endpoint: validate uploaded tarball #28

Open
adamsondavid opened this issue Nov 5, 2024 · 2 comments
Labels
enhancement New feature or request

Comments

@adamsondavid (Owner)
No description provided.

@adamsondavid adamsondavid changed the title runtime or deploy server: validate env vars of deployment tarball runtime or deploy server: validate env vars json file and structure of deployment tarball Nov 5, 2024
@adamsondavid adamsondavid changed the title runtime or deploy server: validate env vars json file and structure of deployment tarball deploy server: validate env vars json file and structure of deployment tarball Jan 28, 2025
@adamsondavid (Owner, Author)

  • validate the overall size of the entire uploaded tarball
  • validate the size of each individual file within the tarball to guard against zip bombs
  • validate that the tarball does not contain symlinks
  • validate that the tarball does not contain files that would be placed at unexpected locations, e.g. outside the unpack dir
  • if functions/env.json exists, validate that it a) is valid JSON and b) matches the type Record<string, string> using zod
  • if functions exist, validate that their file size does not exceed a threshold (may already be covered by the per-file size rule above)
  • if functions exist, validate that they contain neither a) dynamic imports nor b) relative imports (not that important; therefore optional)

@adamsondavid adamsondavid changed the title deploy server: validate env vars json file and structure of deployment tarball api server: deploy endpoint: validate uploaded tarball Jan 30, 2025
@adamsondavid adamsondavid marked this as a duplicate of #19 Jan 30, 2025
@adamsondavid adamsondavid marked this as a duplicate of #20 Jan 30, 2025
@adamsondavid (Owner, Author)

example code starting point:

// Sketch using tar-stream rather than node-tar: node-tar's Pack is
// filesystem-oriented, while tar-stream's pack.entry() accepts in-memory
// buffers, which the repacking step below needs. Also, throwing inside
// stream event handlers would not reject the surrounding promise, so
// validation failures destroy the extract stream with an error instead.
const tar = require('tar-stream');
const zlib = require('zlib');
const path = require('path');
const { pipeline } = require('stream/promises');

async function copyTarGzInMemory(sourceStream, destinationStream, maxFileSize = 4 * 1024 * 1024, maxFiles = 1024) {
  const files = [];
  let fileCount = 0; // counter for the number of entries seen

  // Step 1: extract and validate entries in memory
  const extract = tar.extract();
  extract.on('entry', (header, entryStream, next) => {
    // enforce the entry-count limit
    fileCount++;
    if (fileCount > maxFiles) {
      return extract.destroy(new Error(`Tarball contains more than the allowed number of files (${maxFiles}).`));
    }

    // reject symbolic and hard links
    if (header.type === 'symlink' || header.type === 'link') {
      return extract.destroy(new Error(`Links are not allowed: ${header.name}`));
    }

    // reject paths that would escape the unpack directory
    const normalized = path.posix.normalize(header.name);
    if (path.posix.isAbsolute(normalized) || normalized === '..' || normalized.startsWith('../')) {
      return extract.destroy(new Error(`Invalid file path detected: ${header.name}`));
    }

    // accumulate file contents, enforcing the per-file size limit
    let fileSize = 0;
    const chunks = [];
    entryStream.on('data', (chunk) => {
      fileSize += chunk.length;
      if (fileSize > maxFileSize) {
        return extract.destroy(new Error(`File ${header.name} exceeds maximum allowed size (${maxFileSize} bytes).`));
      }
      chunks.push(chunk);
    });
    entryStream.on('end', () => {
      if (header.type === 'file') {
        files.push({ name: header.name, content: Buffer.concat(chunks) });
      }
      next();
    });
  });

  await pipeline(sourceStream, zlib.createGunzip(), extract);

  // Step 2: repack the validated files into a new tar.gz
  const pack = tar.pack();
  for (const file of files) {
    pack.entry({ name: file.name }, file.content);
  }
  pack.finalize();

  await pipeline(pack, zlib.createGzip(), destinationStream);
}

module.exports = copyTarGzInMemory;
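The functions/env.json rule from the checklist is not covered by the copy function above; a minimal sketch of that check, using plain JS here instead of zod (the equivalent zod schema would be `z.record(z.string())`), with `parseEnvJson` as a hypothetical helper name:

```javascript
// Validate that a buffer holds JSON matching Record<string, string>.
function parseEnvJson(buffer) {
  let parsed;
  try {
    parsed = JSON.parse(buffer.toString('utf8'));
  } catch {
    throw new Error('functions/env.json is not valid JSON');
  }
  // must be a plain object, not null or an array
  if (typeof parsed !== 'object' || parsed === null || Array.isArray(parsed)) {
    throw new Error('functions/env.json must be a JSON object');
  }
  // every value must be a string
  for (const [key, value] of Object.entries(parsed)) {
    if (typeof value !== 'string') {
      throw new Error(`env var "${key}" must be a string`);
    }
  }
  return parsed;
}
```

This would run against the `functions/env.json` entry collected during extraction, before repacking.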

@adamsondavid adamsondavid added the enhancement New feature or request label Jan 30, 2025