Linux kernel mirror (for testing) git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git

ftracetest: Add POSIX.3 standard and XFAIL result codes

Add XFAIL and the POSIX 1003.3 standard codes (UNRESOLVED/
UNTESTED/UNSUPPORTED) as result codes. These are used for
results where the test case is expected to fail, or where the
required feature is unsupported (e.g. disabled by kernel config).

To return these result codes, this patch introduces the
exit_unresolved, exit_untested, exit_unsupported and exit_xfail
functions, which use real-time signals to report the result
code back to ftracetest.
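As a standalone sketch (not the actual ftracetest source), the pattern works like this: the harness traps a real-time signal, and a test-side helper sends that signal to the harness PID and then exits 0 so the notification itself never looks like a failure. The numbers match the patch (SIG_BASE=36, UNSUPPORTED=4, hence signal 40 in the log below):

```shell
#!/bin/sh
# Illustrative sketch of the realtime-signal result reporting
# (simplified from the patch, not the real ftracetest code).
UNSUPPORTED=4
SIG_BASE=36                # first usable realtime signal number
SIG_UNSUPPORTED=$((SIG_BASE + UNSUPPORTED))
SIG_RESULT=0
SIG_PID=$$                 # harness PID; $$ is not updated in subshells

# Harness side: record the result code when the signal arrives.
trap 'SIG_RESULT=$UNSUPPORTED' $SIG_UNSUPPORTED

# Test side: notify the harness, then exit 0 so errexit stays quiet.
exit_unsupported() {
	kill -s $SIG_UNSUPPORTED $SIG_PID
	exit 0
}

# Run a "test case" in a subshell, as ftracetest does.
( exit_unsupported )
echo "recorded result code: $SIG_RESULT"
```

The subshell exits 0, so the harness distinguishes UNSUPPORTED from PASS only by the trapped signal, not by the exit status.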

This also sets the "errexit" option for the test cases, so that
the tests don't need to exit explicitly on error.
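For illustration (a minimal standalone snippet, not part of the patch), this is the errexit behavior the tests now rely on: under "set -e" the first failing command terminates the shell with that command's status, replacing the old "|| exit 1" idiom after every command:

```shell
#!/bin/sh
# Minimal errexit demo (standalone sketch, not ftracetest code).
(
	set -e                # same option ftracetest sets for each test
	false                 # first failure: the subshell stops here...
	echo "never reached"  # ...so this line is not executed
)
STATUS=$?
echo "subshell exit status: $STATUS"
```

The subshell's exit status is that of the failing command, which the harness then maps to FAIL.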

Note that if a test returns UNRESOLVED/UNSUPPORTED/FAIL, its
test log, including the executed commands, is shown on the
console and in the main logfile, as below.

------
# ./ftracetest samples/
=== Ftrace unit tests ===
[1] failure-case example [FAIL]
execute: /home/fedora/ksrc/linux-3/tools/testing/selftests/ftrace/samples/fail.tc
+ . /home/fedora/ksrc/linux-3/tools/testing/selftests/ftrace/samples/fail.tc
++ cat non-exist-file
cat: non-exist-file: No such file or directory
[2] pass-case example [PASS]
[3] unresolved-case example [UNRESOLVED]
execute: /home/fedora/ksrc/linux-3/tools/testing/selftests/ftrace/samples/unresolved.tc
+ . /home/fedora/ksrc/linux-3/tools/testing/selftests/ftrace/samples/unresolved.tc
++ trap exit_unresolved INT
++ kill -INT 29324
+++ exit_unresolved
+++ kill -s 38 29265
+++ exit 0
[4] unsupported-case example [UNSUPPORTED]
execute: /home/fedora/ksrc/linux-3/tools/testing/selftests/ftrace/samples/unsupported.tc
+ . /home/fedora/ksrc/linux-3/tools/testing/selftests/ftrace/samples/unsupported.tc
++ exit_unsupported
++ kill -s 40 29265
++ exit 0
[5] untested-case example [UNTESTED]
[6] xfail-case example [XFAIL]

# of passed: 1
# of failed: 1
# of unresolved: 1
# of untested: 1
# of unsupported: 1
# of xfailed: 1
# of undefined(test bug): 0
------

Link: http://lkml.kernel.org/p/20140929120211.30203.99510.stgit@kbuild-f20.novalocal

Acked-by: Namhyung Kim <namhyung@kernel.org>
Signed-off-by: Masami Hiramatsu <masami.hiramatsu.pt@hitachi.com>
Signed-off-by: Steven Rostedt <rostedt@goodmis.org>

authored by Masami Hiramatsu, committed by Steven Rostedt
915de2ad 2909ef28

+189 -33
+37
tools/testing/selftests/ftrace/README
···
 * The test cases should run on dash (busybox shell) for testing on
   minimal cross-build environments.

+* Note that the tests are run with "set -e" (errexit) option. If any
+  command fails, the test will be terminated immediately.
+
+* The tests can return some result codes instead of pass or fail by
+  using exit_unresolved, exit_untested, exit_unsupported and exit_xfail.
+
+Result code
+===========
+
+Ftracetest supports the following result codes.
+
+* PASS: The test succeeded as expected. A test which exits with 0 is
+  counted as a passed test.
+
+* FAIL: The test failed, but was expected to succeed. A test which exits
+  with !0 is counted as a failed test.
+
+* UNRESOLVED: The test produced unclear or intermediate results.
+  For example, the test was interrupted,
+  or the test depends on a previous test, which failed,
+  or the test was set up incorrectly.
+  A test in any of the above situations must call exit_unresolved.
+
+* UNTESTED: The test was not run; it is currently just a placeholder.
+  In this case, the test must call exit_untested.
+
+* UNSUPPORTED: The test failed because of lack of the feature.
+  In this case, the test must call exit_unsupported.
+
+* XFAIL: The test failed, and was expected to fail.
+  To return XFAIL, call exit_xfail from the test.
+
+There are some sample test scripts for each result code under samples/.
+You can also run the samples as below:
+
+  # ./ftracetest samples/
+
 TODO
 ====
+109 -15
tools/testing/selftests/ftrace/ftracetest
···

 # Testcase management
+# Test result codes - Dejagnu extended code
+PASS=0         # The test succeeded.
+FAIL=1         # The test failed, but was expected to succeed.
+UNRESOLVED=2   # The test produced indeterminate results. (e.g. interrupted)
+UNTESTED=3     # The test was not run, currently just a placeholder.
+UNSUPPORTED=4  # The test failed because of lack of feature.
+XFAIL=5        # The test failed, and was expected to fail.
+
+# Accumulations
 PASSED_CASES=
 FAILED_CASES=
+UNRESOLVED_CASES=
+UNTESTED_CASES=
+UNSUPPORTED_CASES=
+XFAILED_CASES=
+UNDEFINED_CASES=
+TOTAL_RESULT=0
+
 CASENO=0
 testcase() { # testfile
   CASENO=$((CASENO+1))
   prlog -n "[$CASENO]"`grep "^#[ \t]*description:" $1 | cut -f2 -d:`
 }
-failed() {
-  prlog " [FAIL]"
-  FAILED_CASES="$FAILED_CASES $CASENO"
-}
-passed() {
-  prlog " [PASS]"
-  PASSED_CASES="$PASSED_CASES $CASENO"
+
+eval_result() { # retval sigval
+  local retval=$2
+  if [ $2 -eq 0 ]; then
+    test $1 -ne 0 && retval=$FAIL
+  fi
+  case $retval in
+    $PASS)
+      prlog " [PASS]"
+      PASSED_CASES="$PASSED_CASES $CASENO"
+      return 0
+    ;;
+    $FAIL)
+      prlog " [FAIL]"
+      FAILED_CASES="$FAILED_CASES $CASENO"
+      return 1 # this is a bug.
+    ;;
+    $UNRESOLVED)
+      prlog " [UNRESOLVED]"
+      UNRESOLVED_CASES="$UNRESOLVED_CASES $CASENO"
+      return 1 # this is a kind of bug.. something happened.
+    ;;
+    $UNTESTED)
+      prlog " [UNTESTED]"
+      UNTESTED_CASES="$UNTESTED_CASES $CASENO"
+      return 0
+    ;;
+    $UNSUPPORTED)
+      prlog " [UNSUPPORTED]"
+      UNSUPPORTED_CASES="$UNSUPPORTED_CASES $CASENO"
+      return 1 # this is not a bug, but the result should be reported.
+    ;;
+    $XFAIL)
+      prlog " [XFAIL]"
+      XFAILED_CASES="$XFAILED_CASES $CASENO"
+      return 0
+    ;;
+    *)
+      prlog " [UNDEFINED]"
+      UNDEFINED_CASES="$UNDEFINED_CASES $CASENO"
+      return 1 # this must be a test bug
+    ;;
+  esac
 }

+# Signal handling for result codes
+SIG_RESULT=
+SIG_BASE=36  # Use realtime signals
+SIG_PID=$$
+
+SIG_UNRESOLVED=$((SIG_BASE + UNRESOLVED))
+exit_unresolved () {
+  kill -s $SIG_UNRESOLVED $SIG_PID
+  exit 0
+}
+trap 'SIG_RESULT=$UNRESOLVED' $SIG_UNRESOLVED
+
+SIG_UNTESTED=$((SIG_BASE + UNTESTED))
+exit_untested () {
+  kill -s $SIG_UNTESTED $SIG_PID
+  exit 0
+}
+trap 'SIG_RESULT=$UNTESTED' $SIG_UNTESTED
+
+SIG_UNSUPPORTED=$((SIG_BASE + UNSUPPORTED))
+exit_unsupported () {
+  kill -s $SIG_UNSUPPORTED $SIG_PID
+  exit 0
+}
+trap 'SIG_RESULT=$UNSUPPORTED' $SIG_UNSUPPORTED
+
+SIG_XFAIL=$((SIG_BASE + XFAIL))
+exit_xfail () {
+  kill -s $SIG_XFAIL $SIG_PID
+  exit 0
+}
+trap 'SIG_RESULT=$XFAIL' $SIG_XFAIL

 # Run one test case
 run_test() { # testfile
···
   local testlog=`mktemp --tmpdir=$LOG_DIR ${testname}-XXXXXX.log`
   testcase $1
   echo "execute: "$1 > $testlog
-  (cd $TRACING_DIR; set -x ; . $1) >> $testlog 2>&1
-  ret=$?
-  if [ $ret -ne 0 ]; then
-    failed
-    catlog $testlog
-  else
-    passed
+  SIG_RESULT=0
+  # setup PID and PPID, $$ is not updated.
+  (cd $TRACING_DIR; read PID _ < /proc/self/stat ;
+   set -e; set -x; . $1) >> $testlog 2>&1
+  eval_result $? $SIG_RESULT
+  if [ $? -eq 0 ]; then
+    # Remove test log if the test was done as it was expected.
     [ $KEEP_LOG -eq 0 ] && rm $testlog
+  else
+    catlog $testlog
+    TOTAL_RESULT=1
   fi
 }
···
 for t in $TEST_CASES; do
   run_test $t
 done
+
 prlog ""
 prlog "# of passed: " `echo $PASSED_CASES | wc -w`
 prlog "# of failed: " `echo $FAILED_CASES | wc -w`
+prlog "# of unresolved: " `echo $UNRESOLVED_CASES | wc -w`
+prlog "# of untested: " `echo $UNTESTED_CASES | wc -w`
+prlog "# of unsupported: " `echo $UNSUPPORTED_CASES | wc -w`
+prlog "# of xfailed: " `echo $XFAILED_CASES | wc -w`
+prlog "# of undefined(test bug): " `echo $UNDEFINED_CASES | wc -w`

-test -z "$FAILED_CASES" # if no error, return 0
+# if no error, return 0
+exit $TOTAL_RESULT
+4
tools/testing/selftests/ftrace/samples/fail.tc
···
+#!/bin/sh
+# description: failure-case example
+cat non-exist-file
+echo "this is not executed"
+3
tools/testing/selftests/ftrace/samples/pass.tc
···
+#!/bin/sh
+# description: pass-case example
+return 0
+4
tools/testing/selftests/ftrace/samples/unresolved.tc
···
+#!/bin/sh
+# description: unresolved-case example
+trap exit_unresolved INT
+kill -INT $PID
+3
tools/testing/selftests/ftrace/samples/unsupported.tc
···
+#!/bin/sh
+# description: unsupported-case example
+exit_unsupported
+3
tools/testing/selftests/ftrace/samples/untested.tc
···
+#!/bin/sh
+# description: untested-case example
+exit_untested
+3
tools/testing/selftests/ftrace/samples/xfail.tc
···
+#!/bin/sh
+# description: xfail-case example
+cat non-exist-file || exit_xfail
+2 -1
tools/testing/selftests/ftrace/test.d/00basic/basic2.tc
···
 #!/bin/sh
 # description: Basic test for tracers
+test -f available_tracers
 for t in `cat available_tracers`; do
-  echo $t > current_tracer || exit 1
+  echo $t > current_tracer
 done
 echo nop > current_tracer
+3 -3
tools/testing/selftests/ftrace/test.d/00basic/basic3.tc
···
 #!/bin/sh
 # description: Basic trace clock test
-[ -f trace_clock ] || exit 1
+test -f trace_clock
 for c in `cat trace_clock | tr -d \[\]`; do
-  echo $c > trace_clock || exit 1
-  grep '\['$c'\]' trace_clock || exit 1
+  echo $c > trace_clock
+  grep '\['$c'\]' trace_clock
 done
 echo local > trace_clock
+6 -6
tools/testing/selftests/ftrace/test.d/kprobe/add_and_remove.tc
···
 #!/bin/sh
 # description: Kprobe dynamic event - adding and removing

-[ -f kprobe_events ] || exit 1
+[ -f kprobe_events ] || exit_unsupported # this is configurable

-echo 0 > events/enable || exit 1
-echo > kprobe_events || exit 1
-echo p:myevent do_fork > kprobe_events || exit 1
-grep myevent kprobe_events || exit 1
-[ -d events/kprobes/myevent ] || exit 1
+echo 0 > events/enable
+echo > kprobe_events
+echo p:myevent do_fork > kprobe_events
+grep myevent kprobe_events
+test -d events/kprobes/myevent
 echo > kprobe_events
+7 -8
tools/testing/selftests/ftrace/test.d/kprobe/busy_check.tc
···
 #!/bin/sh
 # description: Kprobe dynamic event - busy event check

-[ -f kprobe_events ] || exit 1
+[ -f kprobe_events ] || exit_unsupported

-echo 0 > events/enable || exit 1
-echo > kprobe_events || exit 1
-echo p:myevent do_fork > kprobe_events || exit 1
-[ -d events/kprobes/myevent ] || exit 1
-echo 1 > events/kprobes/myevent/enable || exit 1
+echo 0 > events/enable
+echo > kprobe_events
+echo p:myevent do_fork > kprobe_events
+test -d events/kprobes/myevent
+echo 1 > events/kprobes/myevent/enable
 echo > kprobe_events && exit 1 # this must fail
-echo 0 > events/kprobes/myevent/enable || exit 1
+echo 0 > events/kprobes/myevent/enable
 echo > kprobe_events # this must succeed
-
+5
tools/testing/selftests/ftrace/test.d/template
···
 #!/bin/sh
 # description: %HERE DESCRIBE WHAT THIS DOES%
 # you have to add the ".tc" extension to your testcase file
+# Note that all tests are run with the "errexit" option.
+
 exit 0 # Return 0 if the test is passed, otherwise return !0
+# If the test could not run because of lack of feature, call exit_unsupported
+# If the test returned unclear results, call exit_unresolved
+# If the test is a dummy, or a placeholder, call exit_untested