This is the mail archive of the mauve-discuss@sources.redhat.com mailing list for the Mauve project.



Recognizing expected failures in Mauve


Folks,
I'm proposing the following small change to SimpleTestHarness that would
allow recognition of expected failures (and thus unexpected passes) in a
Mauve test run.

In the libgcj team's use of Mauve we've found it difficult to spot
regressions with the current setup, so I've made some changes that read a
file named 'xfails' (if one exists) in the test run directory and compare
its entries against the results of the test run.  Expected failures are
then output as XFAIL, and unexpected failures remain plain FAIL.  In
parallel, only expected passes are listed as PASS; unexpected passes are
output as UNXPASS (IMO, using XPASS here would be misleading).
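
For concreteness, an xfails file is just a list of the FAIL lines to
expect, one per line, exactly as the harness prints them (the test names
below are made up for illustration; the exact text of each line comes
from getDescription ("FAIL")):

  FAIL: java.net.ServerSocket: bind (number 2)
  FAIL: java.io.File: createNewFile (number 5)

A line matches only if it is character-for-character identical to the
harness output, so it's best to copy entries from a real run rather than
type them by hand.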

Totals are output at the end of the run, much like the current failure
count.  Note that all of this happens only when verbose is true, so the
current default behavior of printing just FAILs is preserved.  If you
don't want to see XFAILs or UNXPASSes even with verbose on, simply don't
create (or populate) an xfails file and you'll see no change in behavior.
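
With some hypothetical counts, the end-of-run summary would read:

  3 of 847 tests failed
  1 of 847 tests unexpectedly passed
  12 of 847 tests expectedly failed

where the last two lines appear only when their counts are nonzero.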

This setup should not impact any postprocessing that might be done by a
larger test infrastructure (e.g. DejaGNU).

An xfails file can be created trivially by running without verbose (or by
grepping for "^FAIL:" in the output of a test run) and saving the result
as the xfails file.  The order of the entries in the xfails file has no
effect on the test reporting.  Since the xfails file depends on the
system under test, no "default" xfails file will live in the Mauve CVS
hierarchy; use whatever mechanism you like to put your xfails file in
your test run directory.
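
For example, assuming you captured a run's output in a file called
mauve.log (the name is just a placeholder for wherever you saved it):

  grep '^FAIL:' mauve.log > xfails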

I've noticed that a few tests (most notably some of the java.net tests
that spawn a separate server process) output bogus PASS/FAIL info.  I
plan to clean up these tests so that their output is meaningful.

Anyway, here is the patch I'm proposing.  I will check it in within a day
or two, so please send feedback ASAP if there are issues.  Thanks!
--warrenl


Index: gnu/testlet/SimpleTestHarness.java
===================================================================
RCS file: /cvs/mauve/mauve/gnu/testlet/SimpleTestHarness.java,v
retrieving revision 1.26
diff -u -p -r1.26 SimpleTestHarness.java
--- SimpleTestHarness.java	1999/12/25 03:27:25	1.26
+++ SimpleTestHarness.java	2001/02/07 11:04:13
@@ -1,4 +1,4 @@
-// Copyright (c) 1998, 1999  Cygnus Solutions
+// Copyright (c) 1998, 1999, 2001  Red Hat, Inc.
 // Written by Tom Tromey <tromey@cygnus.com>
 
 // This file is part of Mauve.
@@ -25,6 +25,7 @@
 
 package gnu.testlet;
 import java.io.*;
+import java.util.Vector;
 
 public class SimpleTestHarness 
     extends TestHarness 
@@ -32,6 +33,9 @@ public class SimpleTestHarness 
 {
   private int count = 0;
   private int failures = 0;
+  private static Vector expected_xfails = new Vector ();
+  private int xfailures = 0;
+  private int unxpasses = 0;
   private int total = 0;
   private boolean verbose = false;
   private boolean debug = false;
@@ -49,12 +53,29 @@ public class SimpleTestHarness 
     {
       if (! result)
 	{
-	  System.out.println (getDescription ("FAIL"));
-	  ++failures;
+	  String desc;
+	  if (!expected_xfails.contains (desc = getDescription ("FAIL")))
+	    {
+	      System.out.println (desc);
+	      ++failures;
+	    }
+	  else if (verbose)
+	    {
+	      System.out.println ("X" + desc);
+	      ++xfailures;
+	    }
 	}
       else if (verbose)
 	{
-	  System.out.println (getDescription ("PASS"));
+	  if (expected_xfails.contains (getDescription ("FAIL")))
+	    {
+	      System.out.println (getDescription ("UNXPASS"));
+	      ++unxpasses;
+	    }
+	  else
+	    {
+	      System.out.println (getDescription ("PASS"));
+	    }
 	}
       ++count;
       ++total;
@@ -73,8 +94,8 @@ public class SimpleTestHarness 
   public Reader getResourceReader (String name) 
     throws ResourceNotFoundException
     {
-      return(new BufferedReader(new InputStreamReader(
-		getResourceStream(name))));
+      return (new BufferedReader (new InputStreamReader (
+		getResourceStream (name))));
     }
 
   public InputStream getResourceStream (String name) 
@@ -88,8 +109,8 @@ public class SimpleTestHarness 
 	{
 	  return 
 	    new FileInputStream (getSourceDirectory () 
-				+ File.separator 
-				+ realName );
+				+ File.separator
+				+ realName);
 	}
       catch (FileNotFoundException ex)
 	{
@@ -205,6 +226,10 @@ public class SimpleTestHarness 
   protected int done ()
     {
       System.out.println(failures + " of " + total + " tests failed");
+      if (unxpasses > 0)
+        System.out.println(unxpasses + " of " + total + " tests unexpectedly passed");
+      if (xfailures > 0)
+        System.out.println(xfailures + " of " + total + " tests expectedly failed");
       return failures > 0 ? 1 : 0;
     }
 
@@ -232,6 +257,22 @@ public class SimpleTestHarness 
 
       SimpleTestHarness harness
 	= new SimpleTestHarness (verbose, debug);
+
+      try
+        {
+          BufferedReader xfile = new BufferedReader (new FileReader ("xfails"));
+	  String str;
+          while ((str = xfile.readLine ()) != null)
+            expected_xfails.addElement (str);
+        }
+      catch (FileNotFoundException ex)
+        {
+          // Nothing.
+        }
+      catch (IOException ex)
+        {
+          // Nothing.
+        }
 
       BufferedReader r
 	= new BufferedReader (new InputStreamReader (System.in));


