## WeBWorK Problems

### Question behaving differently for student and instructor despite same seed

by Alex Jordan -
Number of replies: 7
The PG problem below is being used in a course on a develop-branch server (both pg and webwork2 are on develop). A student had seed 178, and none of the multiple-choice answers is counted as correct. A screenshot of her submitting an answer is here.

When the instructor tried it out, acting as that student, they checked the answer and it was counted correct, as seen in this screenshot.

I took it one step further and changed my own seed for the problem to match the student's. I was then able to submit an answer myself, and again it was counted correct, as seen in this screenshot.

Then I went even further and temporarily gave the student professor-level privileges. This made no difference: the correct answer was again rejected for her.

Is there any insight into how a problem can behave differently for different users, even with the same seed? This is not the first time something like this has happened in the past year or so, going back even to when we were on the master 2.8 release.

# WeBWorK problem written by Carl Yao
# Portland Community College
#
# Compare two fractions with the same denominator.
#
# Last edited: Yao 9/26/2013
#
# ENDDESCRIPTION

## DBsubject('Algebra')
## DBchapter('Basic Algebra')
## DBsection('Algebraic Expressions')
## KEYWORDS('compare','fraction')
## DBCCSS('6.NS.7.a')
## TitleText1('')
## EditionText1('')
## AuthorText1('')
## Section1('')
## Problem1('')
## Author('Carl Yao')
## Institution('PCC')

#This command starts the problem.
DOCUMENT();

"PGstandard.pl", #Always needed
"MathObjects.pl", #Almost always needed
"PGML.pl", #Almost always needed
"contextFraction.pl", #needed to have the Fraction Math Object
"parserPopUp.pl",
"PGcourse.pl",
);

########Begin Problem Setup############

TEXT(beginproblem());
Context("Fraction");

$den = random(5,20,1);
do { $num1 = random(1,$den,1); } until ( gcd($num1,$den) == 1 );
do { $num2 = random(1,$den,1); } until ( gcd($num2,$den) == 1 && $num2 != $num1 );

$frac1 = Fraction($num1,$den);
$frac2 = Fraction($num2,$den);

Context()->strings->add('<'=>{}, '>'=>{}, '='=>{});

if ( $frac1 < $frac2 ) {
  $answer = String('<');
  $popup = PopUp( ["?", $LTS, $GTS, "="], $LTS );
} elsif ( $frac1 > $frac2 ) {
  $answer = String('>');
  $popup = PopUp( ["?", $LTS, $GTS, "="], $GTS );
} else {
  $answer = String('=');
  $popup = PopUp( ["?", $LTS, $GTS, "="], "=" );
}

$answer = '{}'.$answer.'{}';

########Begin What the Student Sees############

BEGIN_PGML
Choose [$LTS], [$GTS], or = to make a true statement.

[`[$frac1]`] [@ $popup->menu() @]* [`[$frac2]`]

END_PGML

########Begin more complicated answer processing (if needed)############

ANS( $popup->cmp(correct_ans_latex_string => "$answer") );

########Begin solution.############

BEGIN_PGML_SOLUTION

These two fractions have the same denominator, so the fraction with the bigger numerator is bigger.

END_PGML_SOLUTION

ENDDOCUMENT();

In reply to Alex Jordan

### Re: Question behaving differently for student and instructor despite same seed

by Hedley Pinsent -

    $scalar2 = $num2/$den;
    if ($scalar1 > $scalar2) { } # etc.

Sorry that I cannot provide a more eloquent suggestion.

hp

In reply to Hedley Pinsent

### Re: Question behaving differently for student and instructor despite same seed

by Alex Jordan -

I think that you can use comparison operators on Fraction objects.

I'm not sure that I'm raising an issue that has anything to do with the code used in this problem (although maybe it does). I'm experiencing a situation where, for some reason, a certain submitted answer elicits a different response for one particular student than the same submitted answer elicits for anyone else, even when the same seed is used. This is never supposed to happen. It suggests either that randomization is being used at some stage that does not respect the seed, or that something else in the PG answer-checking process is distinguishing this user from other users.

When this student hits submit, is it possible that something could be corrupted in the database records associated with this student, and that the corruption leads to the error message she gets?

In reply to Alex Jordan

### Re: Question behaving differently for student and instructor despite same seed

by Danny Glin -

There are a couple of things that come to mind in PG that may differ despite the seed being the same: PG is aware of the number of attempts used by the student, and also of the student's past answers. I'm not sure how either of these would cause this issue, but it's somewhere to look.

In reply to Alex Jordan

### Re: Question behaving differently for student and instructor despite same seed

by Michael Gage -

I notice a difference in the Answer Preview between the instructor screenshot and the student screenshot. Is one of them using MathJax and the other using Images for rendering? That should not make a difference, but there may be a bug.
In reply to Michael Gage

### Re: Question behaving differently for student and instructor despite same seed

by Alex Jordan -

Both are using MathJax. However, when the student submits the answer, whatever issue is causing the answer to be counted incorrect is also preventing MathJax from properly rendering the TeX for the submitted answer.

The code for the problem surrounds the answer string "<" with brace pairs before passing it forward to the answer checker as correct_ans_latex_string. I can only imagine this was a hack to address some issue that arose during the original coding of the problem two years ago. When I get a chance, I should go back and experiment with doing that differently, since some things in PG may have changed since the problem was written. It's just awkward to do that testing, because to experience the issue I actually have to log in as the student. And of course, even if a change like that makes the issue go away, it still won't explain why the problem behaves differently for different users even when the seed is the same.

In reply to Alex Jordan

### Re: Question behaving differently for student and instructor despite same seed

by Davide Cervone -

I don't think the issue has anything to do with randomization. The PopUp object is a subclass of String, and the String object doesn't use any randomization in its checking (only Formulas do).

I suspect it has something to do with the HTML special-character escaping. Mike points out that the student's answers are not being displayed properly in the results table, which I noticed as well. Also, the pop-up menu doesn't show the student's selection, but instead shows the initial question mark. This suggests that the student's answer, as submitted, isn't matching one of the answers in the PopUp, which means the values stored in the PopUp's HTML representation might differ from the strings in the PopUp itself.
The problem uses $LTS and $GTS, so the internal strings in the PopUp might be things like "&lt;", and when these are inserted into the HTML code for the menu, you get things like <option value="&lt;">&lt;</option>, which sends an actual < back as the value rather than the &lt; that the PopUp needs. You could check this by viewing the page source (the actual page source, not the DOM inspector) while logged in as the student, to see what the <option> elements' value attributes contain. I'm wondering whether changing $LTS and $GTS back to just < and > might fix it.
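The round-trip failure described above can be illustrated outside of PG. Here is a minimal sketch in Python (parserPopUp.pl itself is Perl, so this is only an analogy) of why a value attribute that already contains an entity comes back decoded:

```python
from html import unescape

# Suppose the PopUp's internal answer string is the already-escaped form.
internal = "&lt;"

# If that string is placed verbatim into the value attribute, as in
#   <option value="&lt;">&lt;</option>
# the browser decodes the entity before submitting the form:
submitted = unescape(internal)

print(submitted)              # a literal "<"
print(submitted == internal)  # False: the PopUp never sees a match
```

The submitted value is one decoding step away from what the checker stored, so the string comparison fails even though the student picked the right menu entry.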

Also, what version of parserPopUp.pl are you using? There were changes a few months ago; have you updated since then? It may be that the PopUp object isn't properly escaping its values, and that is contributing to the problem.

It might be a browser dependency rather than a difference between student and professor. Are you able to reproduce the error in the same browser that works for you?
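For what it's worth, the escaping fix Davide hints at amounts to escaping the value once more before it is embedded in the attribute, so that the browser's single round of decoding lands back on the intended string. A hedged sketch, again in Python rather than the actual Perl macro code:

```python
from html import escape, unescape

needed = "&lt;"        # the string the answer checker expects to get back

# Escape again before embedding, so the attribute contains "&amp;lt;":
attr = escape(needed)
option = f'<option value="{attr}">{attr}</option>'

# The browser decodes the attribute exactly once on submission:
submitted = unescape(attr)
print(submitted == needed)  # True: the round trip now survives
```

This is the general rule for any already-escaped string destined for an HTML attribute: escape it once more at the point of embedding.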