
Reenable high accuracy timers #285

Merged: 12 commits merged into open-mpi:master on Nov 25, 2014

Conversation

@bosilca (Member) commented on Nov 24, 2014

As discussed on the user mailing list, the current implementation of MPI_Wtime defaults to gettimeofday in most cases. This patch proposes a better approach by:

  1. Reenabling the use of micro-second accuracy OPAL timers
  2. Adding support for cycle-level timers

@goodell please review. Based on our discussion at SC, I made sure that, when available, we use the RDTSC-based timers.
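
For context, a minimal sketch of what a cycle-level timer read looks like on x86-64. The function name is illustrative only, not the actual OPAL API, and this plain RDTSC read omits the serialization discussed later in the review:

#include <stdint.h>

/* Read the x86-64 time-stamp counter.  RDTSC returns the 64-bit counter
 * split across EDX:EAX, so the two halves are recombined here. */
static inline uint64_t
read_cycle_counter(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__("rdtsc" : "=a" (lo), "=d" (hi));
    return ((uint64_t)hi << 32) | lo;
}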

@bosilca (Member, Author) commented on Nov 24, 2014

retest this please

@mellanox-github

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/60/

@mellanox-github

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/61/

An inline review comment (Member) on this diff hunk:

}

#else

The code (in the context) that was in this #else block is buggy. rdtsc must be used with a serializing cpuid instruction to ensure that you are timing what you think you are timing. There's an even better pattern that should be used in general, though I'm not sure we can apply it to MPI_Wtime() because we don't know if we are being called at the beginning of a timing region or at the end. For more information see http://www.intel.com/content/www/us/en/intelligent-systems/embedded-systems-training/ia-32-ia-64-benchmark-code-execution-paper.html
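
As a rough, hypothetical illustration of the pattern from that paper (not code from this PR): the start of a timed region is serialized with CPUID before RDTSC, and the end uses RDTSCP followed by CPUID. The helper names below are made up, and, as noted above, the pattern does not map cleanly onto MPI_Wtime(), which cannot know which end of the region it is on.

#include <stdint.h>

/* CPUID serializes before RDTSC so earlier instructions cannot leak into
 * the measurement; CPUID clobbers rbx/rcx beyond the rax/rdx outputs. */
static inline uint64_t
timed_region_begin(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__("cpuid\n\t"
                         "rdtsc\n\t"
                         : "=a" (lo), "=d" (hi)
                         :: "rbx", "rcx");
    return ((uint64_t)hi << 32) | lo;
}

/* RDTSCP waits for prior instructions to retire before reading the counter;
 * the trailing CPUID keeps later instructions out of the timed region. */
static inline uint64_t
timed_region_end(void)
{
    uint32_t lo, hi, aux, leaf = 0;
    __asm__ __volatile__("rdtscp" : "=a" (lo), "=d" (hi), "=c" (aux));
    (void)aux;  /* IA32_TSC_AUX value, unused here */
    __asm__ __volatile__("cpuid" : "+a" (leaf) :: "rbx", "rcx", "rdx");
    return ((uint64_t)hi << 32) | lo;
}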

@goodell (Member) commented on Nov 24, 2014

George, I've finished my review; see my inline comments on the PR (I'm not sure whether they show up in email or not).

@mellanox-github

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/62/

Build Log
last 50 lines

[...truncated 6050 lines...]
  CC       class/opal_object.lo
  CC       class/opal_graph.lo
  CC       class/opal_atomic_lifo.lo
  CC       class/opal_value_array.lo
  CC       class/opal_pointer_array.lo
  CC       class/opal_ring_buffer.lo
  CC       class/opal_rb_tree.lo
  CC       class/ompi_free_list.lo
  CC       memoryhooks/memory.lo
  CC       runtime/opal_progress.lo
  CC       runtime/opal_finalize.lo
  CC       runtime/opal_init.lo
  CC       runtime/opal_params.lo
  CC       runtime/opal_cr.lo
  CC       runtime/opal_info_support.lo
  CC       runtime/opal_progress_threads.lo
  CC       threads/condition.lo
  CC       threads/mutex.lo
  CC       threads/tsd.lo
  CC       threads/thread.lo
  CC       dss/dss_internal_functions.lo
  CC       dss/dss_compare.lo
  CC       dss/dss_copy.lo
  CC       dss/dss_dump.lo
  CC       dss/dss_load_unload.lo
  CC       dss/dss_lookup.lo
  CC       dss/dss_pack.lo
  CC       dss/dss_peek.lo
  CC       dss/dss_print.lo
  CC       dss/dss_register.lo
  CC       dss/dss_unpack.lo
  CC       dss/dss_open_close.lo
runtime/opal_progress.c: In function 'opal_progress_set_event_poll_rate':
../opal/include/opal/sys/amd64/timer.h:46: error: can't find a register in class 'AREG' while reloading 'asm'
../opal/include/opal/sys/amd64/timer.h:46: error: 'asm' operand has impossible constraints
make[2]: *** [runtime/opal_progress.lo] Error 1
make[2]: *** Waiting for unfinished jobs....
runtime/opal_cr.c: In function 'opal_cr_set_time':
../opal/include/opal/sys/amd64/timer.h:46: error: can't find a register in class 'AREG' while reloading 'asm'
../opal/include/opal/sys/amd64/timer.h:46: error: 'asm' operand has impossible constraints
make[2]: *** [runtime/opal_cr.lo] Error 1
make[2]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal'
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory `/scrap/jenkins/jenkins/jobs/gh-ompi-master-pr/workspace/opal'
make: *** [install-recursive] Error 1
Build step 'Execute shell' marked build as failure
[BFA] Scanning build for known causes...

[BFA] Done. 0s


__asm__ __volatile__ ("cpuid\n\t"
                      "rdtsc\n\t"
                      : "=a" (a), "=d" (d)
                      :: "%rax", "%rbx", "%rcx", "%rdx");
@goodell (Member) commented on this hunk:

I think you want to drop the % characters in the clobber list (two places in this file, once in ia32/timer.h). Untested, but that's my recollection of the syntax.

@bosilca (Member, Author) replied:

The syntax seems to be correct. The problem is that I over-specified the clobber list: rax and rdx were already implied by the "=a" and "=d" one line above.

George.

On Mon, Nov 24, 2014 at 3:29 PM, Dave Goodell notifications@github.com wrote:

In opal/include/opal/sys/amd64/timer.h:

 static inline opal_timer_t
 opal_sys_timer_get_cycles(void)
 {
-    opal_timer_t ret;
-    asm volatile("rdtsc" : "=A"(ret));
-    return ret;
-}
+    unsigned a, d;
+#if 0
+    __asm__ __volatile__ ("cpuid\n\t"
+                          "rdtsc\n\t"
+                          : "=a" (a), "=d" (d)
+                          :: "%rax", "%rbx", "%rcx", "%rdx");

I think you want to drop the % characters in the clobber list (two places in this file, once in ia32/timer.h). Untested, but that's my recollection of the syntax.

Reply to this email directly or view it on GitHub:
https://github.com/open-mpi/ompi/pull/285/files#r20823713.
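
A sketch of the fix being described, assuming it simply drops the redundant rax/rdx entries from the clobber list in the hunk quoted above; the merged OPAL code may differ in detail (for instance, in how the two halves are combined):

/* opal_timer_t: OPAL's integer timer type (64-bit on amd64). */
static inline opal_timer_t
opal_sys_timer_get_cycles(void)
{
    unsigned a, d;
    /* "=a" and "=d" already claim rax/rdx as outputs, so only the registers
     * CPUID additionally clobbers (rbx, rcx) remain in the clobber list. */
    __asm__ __volatile__ ("cpuid\n\t"
                          "rdtsc\n\t"
                          : "=a" (a), "=d" (d)
                          :: "rbx", "rcx");
    return ((opal_timer_t)d << 32) | (opal_timer_t)a;
}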

@bosilca (Member, Author) replied:

I addressed all your concerns/suggestions. Thanks, they really make the
code cleaner and hopefully more robust. This is now ready to be merged.


@mellanox-github

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/63/

Build Log
last 50 lines

[...truncated 16275 lines...]
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 5 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
+ timeout -s SIGKILL 3m mpirun -np 8 -bind-to core -mca btl_openib_if_include mlx5_0:1 -x MXM_RDMA_PORTS=mlx5_0:1 -mca pml ob1 -mca btl self,sm,openib /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/hello_c
Hello, world, I am 6 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 3 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 5 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 7 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 4 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
+ timeout -s SIGKILL 3m mpirun -np 8 -bind-to core -mca btl_openib_if_include mlx5_0:1 -x MXM_RDMA_PORTS=mlx5_0:1 -mca pml cm -mca mtl mxm /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/hello_c
Hello, world, I am 4 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 6 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 5 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 7 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 3 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
+ '[' 4 -gt 0 ']'
+ timeout -s SIGKILL 3m mpirun -np 8 -bind-to core -mca btl_openib_if_include mlx5_0:1 -x MXM_RDMA_PORTS=mlx5_0:1 -mca pml yalla /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/hello_c
Hello, world, I am 7 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 5 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 4 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 6 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
Hello, world, I am 3 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-356-gaca81d5, Unreleased developer copy, 137)
+ for exe in hello_c ring_c
+ exe_path=/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/ring_c
+ PATH=/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/bin:/hpc/local/bin::/usr/local/bin:/bin:/usr/bin:/usr/sbin:/hpc/local/bin:/hpc/local/bin/:/hpc/local/bin/:/sbin:/usr/sbin:/bin:/usr/bin:/usr/local/sbin:/opt/ibutils/bin
+ LD_LIBRARY_PATH=/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/lib:
+ mpi_runner 8 /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/ring_c
+ local np=8
+ local exe_path=/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/ring_c
+ local exe_args=
+ local 'common_mca=-bind-to core'
+ local 'mca=-bind-to core'
+ timeout -s SIGKILL 3m mpirun -np 8 -bind-to core -mca pml ob1 -mca btl self,tcp /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/ring_c
Process 0 sending 10 to 1, tag 201 (8 processes in ring)
./hpc_tests/jenkins/ompi/ompi_jenkins.sh: line 77: 21488 Killed                  $timeout_exe mpirun -np $np $mca -mca pml ob1 -mca btl self,tcp ${exe_path} ${exe_args}
Build step 'Execute shell' marked build as failure
[BFA] Scanning build for known causes...

[BFA] Done. 0s


@mellanox-github

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/64/

Build Log
last 50 lines

[...truncated 16276 lines...]
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
+ timeout -s SIGKILL 3m mpirun -np 8 -bind-to core -mca btl_openib_if_include mlx5_0:1 -x MXM_RDMA_PORTS=mlx5_0:1 -mca pml ob1 -mca btl self,sm,openib /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/hello_c
Hello, world, I am 4 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 6 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 3 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 5 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 7 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
+ timeout -s SIGKILL 3m mpirun -np 8 -bind-to core -mca btl_openib_if_include mlx5_0:1 -x MXM_RDMA_PORTS=mlx5_0:1 -mca pml cm -mca mtl mxm /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/hello_c
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 4 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 6 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 3 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 7 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 5 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
+ '[' 4 -gt 0 ']'
+ timeout -s SIGKILL 3m mpirun -np 8 -bind-to core -mca btl_openib_if_include mlx5_0:1 -x MXM_RDMA_PORTS=mlx5_0:1 -mca pml yalla /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/hello_c
Hello, world, I am 5 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 7 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 1 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 3 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 0 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 2 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 4 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
Hello, world, I am 6 of 8, (Open MPI v1.9a1, package: Open MPI jenkins@jenkins01 Distribution, ident: 1.9.0a1, repo rev: dev-357-ge41a11c, Unreleased developer copy, 137)
+ for exe in hello_c ring_c
+ exe_path=/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/ring_c
+ PATH=/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/bin:/hpc/local/bin::/usr/local/bin:/bin:/usr/bin:/usr/sbin:/hpc/local/bin:/hpc/local/bin/:/hpc/local/bin/:/sbin:/usr/sbin:/bin:/usr/bin:/usr/local/sbin:/opt/ibutils/bin
+ LD_LIBRARY_PATH=/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/lib:
+ mpi_runner 8 /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/ring_c
+ local np=8
+ local exe_path=/var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/ring_c
+ local exe_args=
+ local 'common_mca=-bind-to core'
+ local 'mca=-bind-to core'
+ timeout -s SIGKILL 3m mpirun -np 8 -bind-to core -mca pml ob1 -mca btl self,tcp /var/lib/jenkins/jobs/gh-ompi-master-pr/workspace/ompi_install1/examples/ring_c
Process 0 sending 10 to 1, tag 201 (8 processes in ring)
Process 0 sent to 1
./hpc_tests/jenkins/ompi/ompi_jenkins.sh: line 77:  6776 Killed                  $timeout_exe mpirun -np $np $mca -mca pml ob1 -mca btl self,tcp ${exe_path} ${exe_args}
Build step 'Execute shell' marked build as failure
[BFA] Scanning build for known causes...

[BFA] Done. 0s


@bosilca (Member, Author) commented on Nov 24, 2014

retest this please.

@mellanox-github

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/65/

@mellanox-github

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
http://bgate.mellanox.com/jenkins/job/gh-ompi-master-pr/66/

@goodell (Member) commented on Nov 25, 2014

The current version looks good to me too. Thanks for fixing this, George. I'm assuming you'll push it.

bosilca added a commit that referenced this pull request Nov 25, 2014
Reenable high accuracy timers
@bosilca merged commit 8cae899 into open-mpi:master on Nov 25, 2014
jsquyres pushed a commit to jsquyres/ompi that referenced this pull request Sep 19, 2016
If the user specifies a --map-by <foo> option, then default to bind-t…
dong0321 pushed a commit to dong0321/ompi that referenced this pull request Feb 17, 2020