Add workflow tests for benchmarks #362

Draft · scotts wants to merge 17 commits into main from benchmark_test
Conversation

scotts (Contributor) commented Nov 12, 2024

We're not currently testing our benchmarks. That's a problem: we tend to run the benchmarks only in bursts, when we're digging into performance, so it's easy for code changes to silently break them in between. This PR adds a benchmark testing workflow and makes generate_readme_data.py easy to run in a test mode.

We'll want to expand this workflow to test all benchmarks we care about. This means we'll probably need to move it to GPU instances, but we can iterate on that.
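For illustration only, here is a minimal sketch of what a test mode for generate_readme_data.py could look like. The `--test-run` flag name, the parameter names, and the reduced workload are assumptions, not the actual implementation in this PR:

```python
# Hypothetical sketch of a test mode for generate_readme_data.py.
# The flag name and benchmark parameters are assumptions for illustration.
import argparse


def run_benchmarks(num_samples: int, min_run_seconds: float) -> None:
    # Stand-in for the real benchmark driver in benchmark_decoders_library.
    print(
        f"Benchmarking with num_samples={num_samples}, "
        f"min_run_seconds={min_run_seconds}"
    )


def main() -> None:
    parser = argparse.ArgumentParser(description="Generate README benchmark data.")
    parser.add_argument(
        "--test-run",
        action="store_true",
        help="Run a minimal configuration to verify the benchmarks still execute.",
    )
    args = parser.parse_args()

    if args.test_run:
        # Exercise the same code path with minimal work so CI stays fast;
        # this catches silent breakage without producing meaningful numbers.
        run_benchmarks(num_samples=1, min_run_seconds=0.0)
    else:
        run_benchmarks(num_samples=10, min_run_seconds=2.0)


if __name__ == "__main__":
    main()
```

The point of the test mode is that CI exercises the exact benchmark code path, not a mock; only the workload shrinks.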

PR Status
The benchmark test passes, but what I had to do to get it to pass surprised me:

  1. We're installing PyTorch, TorchAudio, and TorchVision through conda, not pip. Installing TorchVision through pip leads to an error saying the video_reader backend is not available, and installing it from source following the official instructions causes a segfault in the benchmark. I admit I didn't investigate that much; I eventually tried conda instead of pip because that's how I happened to set up my own dev environment, and I don't believe I installed TorchVision from source there.
  2. I had to pin ffmpeg=4 in order for TorchAudio to find the FFmpeg libraries at runtime (see the workflow sketch below).

I suspect, but have not yet confirmed, that part of the problem comes from Decord bundling FFmpeg binaries in its library. I'll test this by running the benchmark without Decord next.
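For illustration, a minimal sketch of what the conda-based workflow could look like. The file path, runner, channels, script location, and `--test-run` flag are all assumptions, not the actual contents of this PR:

```yaml
# Hypothetical sketch of .github/workflows/benchmark.yaml;
# names, paths, and versions are assumptions for illustration only.
name: Benchmark tests

on:
  push:
    branches: [main]
  pull_request:

jobs:
  benchmark:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: conda-incubator/setup-miniconda@v3
        with:
          python-version: "3.11"
      - name: Install dependencies via conda
        shell: bash -el {0}
        run: |
          # conda, not pip: pip-installed TorchVision reports that the
          # video_reader backend is not available. ffmpeg is pinned to 4 so
          # TorchAudio can find the FFmpeg libraries at runtime.
          conda install -y pytorch torchvision torchaudio -c pytorch
          conda install -y "ffmpeg=4" -c conda-forge
          pip install -e .
      - name: Run benchmarks in test mode
        shell: bash -el {0}
        run: python benchmarks/decoders/generate_readme_data.py --test-run
```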

@facebook-github-bot added the CLA Signed label (managed by the Meta Open Source bot) Nov 12, 2024
@scotts force-pushed the benchmark_test branch 7 times, most recently from 98f39c4 to df8ee9b on November 12, 2024
```diff
@@ -13,7 +13,7 @@

 from benchmark_decoders_library import (
     AbstractDecoder,
-    BatchParameters,
+    # BatchParameters,
```
scotts (Contributor, Author) commented Nov 13, 2024
Ignore changes in this file; once we fix these failures on trunk, I'll pull those changes. These changes just let this run without error.
