Sl/google test #1611
Conversation
@xinlipn Is this PR ready for review? If not, then can you please convert this PR into a draft? Could you please also add some motivation to the PR description? This is important for sharing info with external collaborators who do not have access to AMD Jira. Thanks.
Hey Artem, Thanks for the feedback. I have updated the comments with more background.
Interesting feature. Can you please provide me with a link to the description of it?
[Note] Please also note that CTest is able to test whatever the user wants, while IIRC GoogleTest is suitable for testing of C++ programs only. Therefore all the "custom" tests will have to be left under CTest.
@pfultz2 Can you please glance at this PR? Thanks!
@pfultz2 Thank you, Paul.
Just curious, how does gtest handle passing parameters from the CLI to tests? It's not feasible to create test cases for every combination of convolution, so we can enumerate the ones that are important, and then we should be able to pass flags to run the combinations that were not listed. That is how the current test driver works, but it does a Cartesian product of all configurations, so it creates a lot of test cases. From looking at the gtest docs, it looks like we will still need argument parsing, as gtest doesn't take care of that for us, whereas catch2 lets us easily add our own command-line arguments. What's the rationale for choosing gtest over catch? gtest/catch will work well for our unit tests, but not so well for our acceptance tests. I am not sure either one can do what our current acceptance test drivers do, but it looks like catch might get us closer (we could at least take advantage of Clara).
The current test driver uses the same data for the CPU and GPU as well, and it can save the CPU results too, but they are too large for all the test cases we run (which is why saving is not enabled by default). I did try to use bzip2, but decompressing seems to be about as slow as (or even slower than) recomputing the CPU version in most cases (zstd might be a better choice for this). There are a few cases where the same config produces different input values because we are calling the test differently and using … To improve our current test drivers we need:
But using gtest/catch won't help fix any of these issues, except maybe forcing a refactoring of the tests (which is definitely needed).
We can also use naive GPU implementations instead of CPU to compute reference data. This feature is implemented in the driver, but not in conv tests yet.
😉
Thanks everyone for reviewing and suggestions.
@xinlipn If the idea behind re-using the same input and output data is based on fixture classes, then it requires serious redesign of existing tests.
Thank you @atamazov and @pfultz2 for your time and input. Yes, TEST_F() offers the same data configurations for different tests. I pushed some changes as below:
So doing … For example, in migraphx doing:

```cpp
TEST_CASE(expect)
{
    int x = 1;
    int y = 2;
    EXPECT(x == y);
}
```

It will print out:

Does gtest show something similar?
Is the plan to just have our test cases be gtests that just invoke the test driver? So we just keep the test driver and remove the … Enumerating the tests in C++ (with gtest or catch) does provide some nice filtering for running different subsets of tests. However, there is a compile-time cost that can grow as we add more tests or change existing tests. A better option would be to have a JSON file with the different test configurations (it's actually faster to parse JSON than to parse and compile C++). We would just need to provide our own filtering for this.
Can you please elaborate on the machinery you are going to use in the fixture classes to enable the "re-using the data" feature in tests? For example, what is the expected lifetime of the instances of fixture classes (and, consequently, what is the expected lifetime of the re-usable data they own)?
Here's a thread discussing passing CLI parameters to gtest
I don't think so. With the EXPECT_TRUE macro, gtest emits diagnostic messages when the condition is false, e.g. `EXPECT_TRUE(false) << "diagnostic message";`
…ing up-level directory when including header files
Accidentally changed status to closed. Sorry.
…tly; clean up accidentally checked-in gtest code from other branches
@junliume, thank you for the reminder. This was actually caused by code accidentally checked in; it has been reverted. Yet CI doesn't seem to be triggered. Please note the changes in the Jenkinsfile are there to force Docker to build a new container with an updated CMake (3.11+). These changes should be reverted after the Docker image is created.
@xinlipn CI has passed, please ping all reviewers for their opinions. Thanks!
CI after retest has passed. Could @JehandadKhan @atamazov @pfultz2 take another look? Thank you
```diff
@@ -23,7 +23,7 @@
 # SOFTWARE.
 #
 ################################################################################
-cmake_minimum_required( VERSION 3.5 )
+cmake_minimum_required( VERSION 3.11 )
```
Ubuntu 18.04 has CMake 3.10. Would this potentially cause a problem if the releasing team builds from source? (I see in our own Dockerfile we have installed 3.15, so where is 3.11 coming from?)
Hi @junliume, 3.11 is the minimal version that supports FetchContent. If Docker is used on Ubuntu 18.04, I believe the following changes in the Dockerfile should cover the older CMake version issue.

```shell
wget --no-check-certificate -qO - https://apt.kitware.com/keys/kitware-archive-latest.asc 2>/dev/null | apt-key add - && \
sh -c "echo deb https://apt.kitware.com/ubuntu/ bionic main | tee -a /etc/apt/sources.list" && \
```
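For reference, the FetchContent usage that motivates the 3.11 minimum looks roughly like this; a minimal sketch, with the project name and GIT_TAG purely illustrative (the `FetchContent_Populate` form shown is the one available at 3.11, since `FetchContent_MakeAvailable` only arrived in 3.14):

```cmake
cmake_minimum_required(VERSION 3.11)
project(gtest_fetch_example)

include(FetchContent)
FetchContent_Declare(
  googletest
  GIT_REPOSITORY https://github.com/google/googletest.git
  GIT_TAG        release-1.11.0   # illustrative tag
)
# FetchContent_MakeAvailable(googletest) needs CMake 3.14; the 3.11 form:
FetchContent_GetProperties(googletest)
if(NOT googletest_POPULATED)
  FetchContent_Populate(googletest)
  add_subdirectory(${googletest_SOURCE_DIR} ${googletest_BINARY_DIR})
endif()
```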
LGTM
@xinlipn please reopen another PR since this one is breaking
The project is to move tests from CTest to GoogleTest, taking advantage of the same input data, e.g. for testing on CPU and GPU, and saving the intermediate results for later comparison. This avoids repeatedly computing on the same data set and thus reduces testing time.