metaseq · Merge request !518

Add a script to benchmark generator

Open · Administrator requested to merge benchmark-generator into main · Nov 15, 2022
Overview: 10 · Commits: 1 · Pipelines: 0 · Changes: 1

Created by: tangbinh

Summary

Add a new script that collects latency results for OPT models during generation. While it resembles the existing metaseq/scripts/generation_benchmarks.py, it is more general: in addition to latency, it also collects memory usage and GPU traces across various configurations of batch size, input length, and output length. It also uses the GeneratorInterface directly and skips the checkpoint-downloading step.
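As a rough illustration of the sweep described above, the loop below times generation over a grid of batch size, input length, and output length. This is only a sketch: `run_generation` is a hypothetical stand-in for a call into metaseq's GeneratorInterface, and the real script additionally records memory usage and GPU traces.

```python
import itertools
import statistics
import time

def run_generation(batch_size, input_len, output_len):
    # Hypothetical stand-in for a GeneratorInterface call; here we just
    # simulate work proportional to the number of generated tokens.
    time.sleep(0.0001 * batch_size * output_len)

def benchmark(configs, repeats=3):
    """Return mean latency in seconds for each (batch, in_len, out_len)."""
    results = {}
    for batch_size, input_len, output_len in configs:
        timings = []
        for _ in range(repeats):
            start = time.perf_counter()
            run_generation(batch_size, input_len, output_len)
            timings.append(time.perf_counter() - start)
        results[(batch_size, input_len, output_len)] = statistics.mean(timings)
    return results

# Sweep a small grid of configurations and report latency per config.
configs = itertools.product([1, 4], [32, 128], [64])
for (bsz, in_len, out_len), latency in benchmark(configs).items():
    print(f"batch={bsz} in={in_len} out={out_len}: {latency * 1e3:.1f} ms")
```

In the actual script, the timing of each configuration would be paired with peak-memory readings and profiler traces rather than latency alone.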
