Re: automatically grading small programming assignments
On Thu, 14 Dec 2006 21:36:31 -0500 Brian Blais [EMAIL PROTECTED] wrote:

> Paddy wrote:
>> It might turn out to be a poor substitute for the personal touch,
>> especially if they are just starting to program.
>
> Oh, I didn't mean it to completely replace me grading things, but I
> think it would be useful if there were a lot of little assignments
> that could be done automatically, and then some larger ones that I
> would grade by hand. The little ones could be set up so that they can
> submit as many times as they want, until they get it right.

Well, that sounds like a valid plan, but why would you want to grade the
little ones at all, then?

What I would most likely do is publish those small assignments together
with a set of tests for each one, and say that students should write
programs and make sure their programs pass the tests. If you wish, you
could publish two sets of tests, the easy and the tricky ones, and have
them use the easy ones while writing the program, only running it
through the tricky tests once they believe the program is bug-free. This
can be a very valuable experience! (If you can devise the right tests,
of course. ;)

If you either require the skills they develop in the small assignments
for the big assignments, or check 2-3 small assignments by hand, you
should be able to reduce cheating sufficiently... It's just a matter of
making sure they really *do* write the programs and that those programs
*do* pass the tests. Or just require students to hand in the small
assignments together with the testing output, but do not check them at
all (not too many will have the guts to fake the outputs).

Then there is a whole range of ideas about peer review in the education
community, where you could have students verify one another's
programs... but this can sometimes be tricky.

-- 
Best wishes,
Slawomir Nowaczyk ( [EMAIL PROTECTED] )

Someone Else's Code - a commonly used synonym for Bad Code

-- 
http://mail.python.org/mailman/listinfo/python-list
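Slawomir's easy/tricky test idea could be handed out as a plain script. Everything below is illustrative rather than from the thread: the `run_cases` helper, the `list_max` sample solution, and both case lists are invented names.

```python
def run_cases(func, cases):
    """Run func against (args, expected) pairs and report failures."""
    failures = 0
    for args, expected in cases:
        result = func(*args)
        if result != expected:
            failures += 1
            print("FAIL: %s%r -> %r (expected %r)"
                  % (func.__name__, args, result, expected))
    return failures

# Easy cases: handed out with the assignment, used while writing the program.
easy_cases = [(([1, 2, 3],), 3), (([7],), 7)]

# Tricky cases: run only once the student believes the code is bug-free.
tricky_cases = [(([-4, -1, -9],), -1), (([2, 2, 2],), 2)]

def list_max(items):
    """A sample student solution: return the largest element."""
    best = items[0]
    for item in items:
        if item > best:
            best = item
    return best

if __name__ == "__main__":
    total = run_cases(list_max, easy_cases) + run_cases(list_max, tricky_cases)
    print("total failures:", total)
```

Publishing `easy_cases` up front and `tricky_cases` later matches the two-stage workflow Slawomir describes.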
Re: automatically grading small programming assignments
Jeff Rush wrote:

> For another solution, I wonder whether you could make use of the new
> Abstract Syntax Tree (AST) in Python 2.5, where you convert the source
> of an attempt into an abstract data structure, anonymize the
> method/variable/class names and compare the tree against a correct
> solution. It would let you quickly handle those students who solved it
> in a conformist way, and then you'd need to manually review the rest
> for creatively solving it another way. ;-)

You could attempt that kind of solution using previous versions of
Python (and the compiler module), but as soon as you want to compare two
different ASTs - and I think that unless there's only one obvious
solution, they won't be identical - you need to descend into a world of
program transformations that Python doesn't encourage. It's probably no
coincidence that the functional programming people were, as far as I
recall, the only ones trying this kind of automatic grading back when I
was a student.

Paul
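For what it's worth, the anonymize-and-compare idea can be sketched with the modern `ast` module (the Python 2.5 `compiler`/`_ast` interfaces differ). As Paul notes, this only catches solutions that are structurally identical to the reference; the sample sources below are invented:

```python
import ast

def normalized_dump(source):
    """Parse source and replace every variable/function/argument name
    with a positional placeholder, so that structurally identical
    programs compare equal regardless of the names chosen."""
    tree = ast.parse(source)
    mapping = {}
    for node in ast.walk(tree):
        # 'id' covers Name nodes, 'arg' covers function parameters,
        # 'name' covers function and class definitions.
        for field in ("id", "arg", "name"):
            value = getattr(node, field, None)
            if isinstance(value, str):
                placeholder = mapping.setdefault(value, "v%d" % len(mapping))
                setattr(node, field, placeholder)
    return ast.dump(tree)

student = ("def f(xs):\n"
           "    m = xs[0]\n"
           "    for x in xs:\n"
           "        if x > m:\n"
           "            m = x\n"
           "    return m\n")

reference = ("def biggest(items):\n"
             "    top = items[0]\n"
             "    for it in items:\n"
             "        if it > top:\n"
             "            top = it\n"
             "    return top\n")

# Same shape, different names: the normalized dumps match.
print(normalized_dump(student) == normalized_dump(reference))
```

A creative solution such as `return sorted(xs)[-1]` would produce a different tree, which is exactly the case Paul says forces manual review.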
Re: automatically grading small programming assignments
Hi Brian,

You could make great use of XML-RPC here. XML-RPC is /really/ easy to
use. Here is a simple example:

http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/81549

You put procedures on the server that check the submitted arguments
against the required result and report back to the student whether the
submission passes or fails. Here is another example using XML-RPC over
https, for security:

http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/496786

So the idea is that the student calls a procedure on the XML-RPC server
(which you set up) and passes his results as an argument, and your
server procedure can return True or False. One benefit is that if you
change the input to the tests, you need only update the server.
Actually, you could let the procedures on the server accept both test
input and student results, and return True or False. This would be
cool :)

Caleb

On Dec 14, 6:27 pm, Brian Blais [EMAIL PROTECTED] wrote:
> Hello,
>
> I have a couple of classes where I teach introductory programming
> using Python. What I would love to have is for the students to go
> through a lot of very small programs, to learn the basic programming
> structure. Things like: return the maximum in a list, making lists
> with certain patterns, very simple string parsing, etc. Unfortunately,
> it takes a lot of time to grade such things by hand, so I would like
> to automate it as much as possible.
>
> I envision a number of possible solutions. In one solution, I provide
> a function template with a docstring, and they have to fill it in to
> pass a doctest. Is there a good (and safe) way to do that online?
> Something like having a student post code, and the doctest returns.
> I'd love to allow them to submit until they get it, logging each
> attempt.
>
> Or perhaps there is a better way to do this sort of thing. How do
> others who teach Python handle this?
>
> thanks,
>
> Brian Blais
>
> -- 
> - [EMAIL PROTECTED]
> http://web.bryant.edu/~bblais
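A minimal version of Caleb's check-procedure server might look like the sketch below. The `EXPECTED` table, the assignment name, and the use of an OS-assigned port are all made up for illustration, and the import paths are the modern Python 3 ones (in 2006 these lived in the `SimpleXMLRPCServer` and `xmlrpclib` modules). The server is run in a background thread only so the client call can be shown in the same script:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Instructor-side table of expected answers (illustrative only).
EXPECTED = {"max_of_list": 9}

def check(assignment, answer):
    """Return True if the student's answer matches the stored result."""
    return EXPECTED.get(assignment) == answer

# Port 0 lets the OS pick a free port; a real deployment would fix one.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(check)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Student side: call the remote procedure with a candidate answer.
proxy = ServerProxy("http://127.0.0.1:%d" % port)
print(proxy.check("max_of_list", 9))   # True
print(proxy.check("max_of_list", 10))  # False

server.shutdown()
```

As Caleb suggests, changing the grading criteria then only requires touching the server's table, not anything handed to students.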
Re: automatically grading small programming assignments
Dan Bishop wrote:

> On Dec 14, 8:36 pm, Brian Blais [EMAIL PROTECTED] wrote:
>> [EMAIL PROTECTED] wrote:
>>> [EMAIL PROTECTED] wrote:
>>>> Then on your PC you can run a script that loads each of such
>>>> programs, and runs a good series of tests, to test their quality...
>>>
>>> What happens if someone-- perhaps not even someone in the class--
>>> does some version of os.system('rm -Rf /')?
>>
>> I was thinking of including a dummy os.py and sys.py, so import os
>> and import sys would fail. Would this work?
>
> How would they access their command-line arguments without sys.argv?

The types of assignments that I am envisioning (finding the maximum in a
list, parsing strings, etc.) will not need anything offered in os or
sys. Certainly, if they were needed, another solution would need to be
found.

bb

-- 
- [EMAIL PROTECTED]
http://web.bryant.edu/~bblais
Re: automatically grading small programming assignments
Brian Blais wrote:

> Dan Bishop wrote:
>> On Dec 14, 8:36 pm, Brian Blais [EMAIL PROTECTED] wrote:
>>> [EMAIL PROTECTED] wrote:
>>>> [EMAIL PROTECTED] wrote:
>>>>> Then on your PC you can run a script that loads each of such
>>>>> programs, and runs a good series of tests, to test their
>>>>> quality...
>>>>
>>>> What happens if someone-- perhaps not even someone in the class--
>>>> does some version of os.system('rm -Rf /')?
>>>
>>> I was thinking of including a dummy os.py and sys.py, so import os
>>> and import sys would fail. Would this work?
>>
>> How would they access their command-line arguments without sys.argv?
>
> The types of assignments that I am envisioning (finding the maximum in
> a list, parsing strings, etc.) will not need anything offered in os or
> sys. Certainly, if they were needed, another solution would need to be
> found.

If you do a search on the web, you will find that there are many other
security problems in Python that cannot be prevented by simply including
dummy modules for os and sys. Brett Cannon's PhD thesis is, afaik, based
on looking at ways of creating a secure Python environment. Other
suggestions mentioned before (like running in a virtual environment)
might be the best way to go for now.

Having the users run the program on their own machine (as would be done
with the current version of Crunchy, already mentioned in this thread)
would keep yours safe. Crunchy's doctest feature could easily be
modified so that it logs the number of attempts and mails the results to
a given address.

André
Re: automatically grading small programming assignments
On Fri, Dec 15, 2006 at 06:44:37AM +0000, Dennis Lee Bieber wrote:

> On Thu, 14 Dec 2006 12:27:07 -0500, Brian Blais [EMAIL PROTECTED]
> declaimed the following in gmane.comp.python.general:
>
>> I envision a number of possible solutions. In one solution, I provide
>> a function template with a docstring, and they have to fill it in to
>> pass a doctest. Is there a good (and safe) way to do that online?
>> Something like having a student post code, and the doctest returns.
>> I'd love to allow them to submit until they get it, logging each
>> attempt.
>
> I have some problems with the concept behind the last sentence... It
> encourages brute-force trial-and-error coding (unless you are going to
> tell them that each submittal gets logged, AND that multiple
> submittals will reduce the final score they get for the assignment).

It's been decades since I was in a programming course... salt
accordingly.

Whenever I learn a new language, I spend a LOT of time just hacking
stuff and seeing what it does -- learning syntax and effects by trial
and error. Since I already know (okay, knew) good coding practice, the
resulting code would not look like it had been hacked together in such a
manner, but if I were graded on how many times I executed a bit of code,
I'd fail right out.

Now, maybe in the second or third semester of a particular language,
that might make sense -- the student should already understand syntax
and effects well enough to avoid that stuff.

.02 from a Python newb.
automatically grading small programming assignments
Hello,

I have a couple of classes where I teach introductory programming using
Python. What I would love to have is for the students to go through a
lot of very small programs, to learn the basic programming structure.
Things like: return the maximum in a list, making lists with certain
patterns, very simple string parsing, etc. Unfortunately, it takes a lot
of time to grade such things by hand, so I would like to automate it as
much as possible.

I envision a number of possible solutions. In one solution, I provide a
function template with a docstring, and they have to fill it in to pass
a doctest. Is there a good (and safe) way to do that online? Something
like having a student post code, and the doctest returns. I'd love to
allow them to submit until they get it, logging each attempt.

Or perhaps there is a better way to do this sort of thing. How do others
who teach Python handle this?

thanks,

Brian Blais

-- 
- [EMAIL PROTECTED]
http://web.bryant.edu/~bblais
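The function-template-plus-doctest idea might look like this from the student's side. The `list_max` name and its docstring examples are invented for illustration, and a working body is shown where the student's code would go:

```python
import doctest

def list_max(items):
    """Return the largest element of a non-empty list.

    >>> list_max([3, 1, 2])
    3
    >>> list_max([-5, -2, -9])
    -2
    """
    # The student replaces this body with their own implementation.
    largest = items[0]
    for item in items[1:]:
        if item > largest:
            largest = item
    return largest

if __name__ == "__main__":
    # Zero failures means the submission passes the instructor's examples.
    results = doctest.testmod()
    print("%d of %d tests failed" % (results.failed, results.attempted))
```

The instructor would ship the same file with the function body stubbed out; the docstring itself carries the grading criteria.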
Re: automatically grading small programming assignments
Brian Blais wrote:

> Hello,
>
> I have a couple of classes where I teach introductory programming
> using Python. What I would love to have is for the students to go
> through a lot of very small programs, to learn the basic programming
> structure. Things like: return the maximum in a list, making lists
> with certain patterns, very simple string parsing, etc. Unfortunately,
> it takes a lot of time to grade such things by hand, so I would like
> to automate it as much as possible.
>
> I envision a number of possible solutions. In one solution, I provide
> a function template with a docstring, and they have to fill it in to
> pass a doctest. Is there a good (and safe) way to do that online?
> Something like having a student post code, and the doctest returns.
> I'd love to allow them to submit until they get it, logging each
> attempt.

On a different matter, coding style: you could run something such as
Pylint on the submitted code and penalize students based on the number
of warnings emitted. Maybe some adjustments would be necessary -- my
experience with picky compilers is that most, but not ALL, warnings
indicate problems.
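Real Pylint emits far richer diagnostics, but the penalize-by-warning-count scheme can be sketched with a toy checker. Everything here is invented: the three mechanical checks, the point values, and the cap on the deduction.

```python
def style_warnings(source):
    """Collect (line_number, message) pairs for a few mechanical style
    problems: overlong lines, trailing whitespace, tab indentation."""
    warnings = []
    for number, line in enumerate(source.splitlines(), 1):
        if len(line) > 79:
            warnings.append((number, "line too long"))
        if line != line.rstrip():
            warnings.append((number, "trailing whitespace"))
        if line.startswith("\t"):
            warnings.append((number, "tab indentation"))
    return warnings

def style_penalty(source, points_per_warning=0.5, max_penalty=3.0):
    """Translate the warning count into a capped grade deduction."""
    return min(len(style_warnings(source)) * points_per_warning, max_penalty)

sample = "def f(x):\n\treturn x   \n"
print(style_warnings(sample))  # two warnings, both on line 2
print(style_penalty(sample))   # 1.0
```

Capping the penalty reflects the "not ALL warnings indicate problems" caveat: style issues should nudge the grade, not dominate it.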
Re: [Edu-sig] automatically grading small programming assignments
Hello Brian, I do not teach (much to my regrets) but I have been thinking about what you describe. See below. On 12/14/06, Brian Blais [EMAIL PROTECTED] wrote: Hello, I have a couple of classes where I teach introductory programming using Python. What I would love to have is for the students to go through a lot of very small programs, to learn the basic programming structure. Things like, return the maximum in a list, making lists with certain patterns, very simple string parsing, etc. Unfortunately, it takes a lot of time to grade such things by hand, so I would like to automate it as much as possible. I envision a number of possible solutions. In one solution, I provide a function template with a docstring, and they have to fill it in to past a doctest. Is there a good (and safe) way to do that online? Something like having a student post code, and the doctest returns. I'd love to allow them to submit until they get it, logging each attempt. I may have a partial solution. I (co-)wrote a program called Crunchy (crunchy.sf.net) which, among other features, allow automated correction of code that has to satisfy a given docstring. As it stands, it only allows self-evaluation i.e. there's no login required nor is the solution forwarded to someone else. (This has been requested by others and may, eventually, be incorporated in Crunchy.) So, as a tool for learning, it's working; the grading component is simply not there. However, the code is open-source and you could adopt it to your needs ;-) André Or perhaps there is a better way to do this sort of thing. How do others who teach Python handle this? thanks, Brian Blais -- - [EMAIL PROTECTED] http://web.bryant.edu/~bblaishttp://web.bryant.edu/%7Ebblais ___ Edu-sig mailing list Edu-sig@python.org http://mail.python.org/mailman/listinfo/edu-sig -- http://mail.python.org/mailman/listinfo/python-list
Re: automatically grading small programming assignments
Brian Blais wrote:

> Hello,
>
> I have a couple of classes where I teach introductory programming
> using Python. What I would love to have is for the students to go
> through a lot of very small programs, to learn the basic programming
> structure. Things like: return the maximum in a list, making lists
> with certain patterns, very simple string parsing, etc. Unfortunately,
> it takes a lot of time to grade such things by hand, so I would like
> to automate it as much as possible.
>
> I envision a number of possible solutions. In one solution, I provide
> a function template with a docstring, and they have to fill it in to
> pass a doctest. Is there a good (and safe) way to do that online?
> Something like having a student post code, and the doctest returns.
> I'd love to allow them to submit until they get it, logging each
> attempt.
>
> Or perhaps there is a better way to do this sort of thing. How do
> others who teach Python handle this?

It might turn out to be a poor substitute for the personal touch,
especially if they are just starting to program.

- Paddy.
Re: automatically grading small programming assignments
Brian Blais wrote:

> I have a couple of classes where I teach introductory programming
> using Python. What I would love to have is for the students to go
> through a lot of very small programs, to learn the basic programming
> structure. Things like: return the maximum in a list, making lists
> with certain patterns, very simple string parsing, etc. Unfortunately,
> it takes a lot of time to grade such things by hand, so I would like
> to automate it as much as possible.
>
> I envision a number of possible solutions. In one solution, I provide
> a function template with a docstring, and they have to fill it in to
> pass a doctest. Is there a good (and safe) way to do that online?
> Something like having a student post code, and the doctest returns.
> I'd love to allow them to submit until they get it, logging each
> attempt.

A few people were playing around with a secure Python online
interpreter/interactive prompt a while back:

http://groups-beta.google.com/group/comp.lang.python/browse_thread/thread/66e659942f95b1a0/6f46d738a8859c2f

I don't think any of them was quite at the usability level you'd need,
though.

Can you just have the students download one file for each program you
want them to run? I'd ship something like::

    import unittest

    # ----------------------------------------
    # This is the function you need to fill in
    # ----------------------------------------
    def list_max(items):
        '''Your code goes here'''

    # ========================================
    # Don't modify code below this point
    # ========================================
    class Test(unittest.TestCase):
        def test_range(self):
            self.failUnlessEqual(list_max(range(10)), 9)
        def test_first_elem(self):
            self.failUnlessEqual(list_max([1, 0, 0, 0]), 1)
        ...

    if __name__ == '__main__':
        unittest.main()

Then all your students would have to do is download the file, fill in
the definition of the appropriate function, and run the same file until
the unittest said that everything worked. I guess that doesn't solve
your logging-each-attempt problem though...

STeVe
Re: automatically grading small programming assignments
Brian Blais, just an idea: create an online form to upload the tiny
program(s). Such programs can be one per file. Then on your PC you can
run a script that loads each of these programs and runs a good series of
tests, to test their quality... Such tests can cover all sorts of
things: speed, coding quality, that the results are correct, etc.

You can even put in some way to authenticate students... (BTW, good
students don't cheat. Cheating isn't a good way to learn to program, and
they probably know this.)

Bye,
bearophile
Re: automatically grading small programming assignments
[EMAIL PROTECTED] wrote:

> Then on your PC you can run a script that loads each of such programs,
> and runs a good series of tests, to test their quality...

What happens if someone-- perhaps not even someone in the class-- does
some version of os.system('rm -Rf /')?
Re: automatically grading small programming assignments
On Thu, 14 Dec 2006 12:27:07 -0500, Brian Blais [EMAIL PROTECTED] wrote:

> Hello,
>
> I have a couple of classes where I teach introductory programming
> using Python. What I would love to have is for the students to go
> through a lot of very small programs, to learn the basic programming
> structure. Things like: return the maximum in a list, making lists
> with certain patterns, very simple string parsing, etc. Unfortunately,
> it takes a lot of time to grade such things by hand, so I would like
> to automate it as much as possible.
>
> I envision a number of possible solutions. In one solution, I provide
> a function template with a docstring, and they have to fill it in to
> pass a doctest. Is there a good (and safe) way to do that online?
> Something like having a student post code, and the doctest returns.
> I'd love to allow them to submit until they get it, logging each
> attempt.
>
> Or perhaps there is a better way to do this sort of thing. How do
> others who teach Python handle this?
>
> thanks,
>
> Brian Blais

Perhaps the Sphere Online Judge can help you: https://www.spoj.pl/info/

Dennis
Re: automatically grading small programming assignments
[EMAIL PROTECTED] wrote:

> [EMAIL PROTECTED] wrote:
>> Then on your PC you can run a script that loads each of such
>> programs, and runs a good series of tests, to test their quality...
>
> What happens if someone-- perhaps not even someone in the class-- does
> some version of os.system('rm -Rf /')?

The system administrator should make sure that student user accounts (or
the auto-testing account) don't have access to that. Probably should
make sure that user applications only get a limited amount of memory,
too.

-- 
Carl J. Van Arsdall
[EMAIL PROTECTED]
Build and Release
MontaVista Software
Re: [Edu-sig] automatically grading small programming assignments
Brian Blais wrote:

> I envision a number of possible solutions. In one solution, I provide
> a function template with a docstring, and they have to fill it in to
> pass a doctest. Is there a good (and safe) way to do that online?
> Something like having a student post code, and the doctest returns.
> I'd love to allow them to submit until they get it, logging each
> attempt.

Crunchy Frog (now called just Crunchy, as I understand). I just
researched and presented on it this past weekend and was impressed with
its abilities, including that of the instructor providing a doctest and
the student working to write code that lets it pass. Very cool.

There is not, however, currently any logging of progress, counts of
attempts to get it right, etc. But I would imagine that would not be
hard to add to the Crunchy Frog backend. It is just an HTTP proxy with
some template expansion to get a Python interpreter inside the browser
window.

Safety -- ehh. Each Python interpreter is running inside that HTTP
proxy with full access to the underlying system, as whatever user it is
running as. The design is to have each student run it locally, so they
can only trash their own system. However, I could imagine you could set
it up to run on a private classroom server, where the attempt records
would be kept, and still be safe.

http://crunchy.sourceforge.net/index.html

For another solution, I wonder whether you could make use of the new
Abstract Syntax Tree (AST) in Python 2.5, where you convert the source
of an attempt into an abstract data structure, anonymize the
method/variable/class names, and compare the tree against a correct
solution. It would let you quickly handle those students who solved it
in a conformist way, and then you'd need to manually review the rest for
creatively solving it another way. ;-)

But I think Crunchy is the most classroom-friendly way to quickly solve
this. A weekend's work at most.

-Jeff
Re: automatically grading small programming assignments
[EMAIL PROTECTED] wrote:

> [EMAIL PROTECTED] wrote:
>> Then on your PC you can run a script that loads each of such
>> programs, and runs a good series of tests, to test their quality...
>
> What happens if someone-- perhaps not even someone in the class-- does
> some version of os.system('rm -Rf /')?

I was thinking of including a dummy os.py and sys.py, so import os and
import sys would fail. Would this work? Is there anything else obvious?

I can have student authentication; that's not a problem.

bb

-- 
- [EMAIL PROTECTED]
http://web.bryant.edu/~bblais
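Brian's dummy-module idea can be approximated with an import hook rather than shadow files. As later replies point out, this is trivially bypassed and is not a real sandbox; the blocked-module list is illustrative, and on Python 2 the module to patch would be __builtin__ rather than builtins:

```python
import builtins

# Modules a submission is not allowed to import (illustrative list).
BLOCKED = {"os", "sys", "subprocess", "shutil"}

real_import = builtins.__import__

def guarded_import(name, *args, **kwargs):
    """Refuse blacklisted modules; pass everything else through."""
    if name.split(".")[0] in BLOCKED:
        raise ImportError("module %r is not allowed in submissions" % name)
    return real_import(name, *args, **kwargs)

builtins.__import__ = guarded_import
try:
    import os  # raises ImportError while the hook is installed
except ImportError as exc:
    print(exc)
finally:
    # Always restore the real import machinery.
    builtins.__import__ = real_import
```

Note that this only blocks the import statement itself; code that reaches os through an already-imported module, or through introspection, sails right past it, which is the gap the replies about real sandboxing are pointing at.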
Re: automatically grading small programming assignments
Paddy wrote:

> It might turn out to be a poor substitute for the personal touch,
> especially if they are just starting to program.

Oh, I didn't mean it to completely replace me grading things, but I
think it would be useful if there were a lot of little assignments that
could be done automatically, and then some larger ones that I would
grade by hand. The little ones could be set up so that they can submit
as many times as they want, until they get it right.

bb

-- 
- [EMAIL PROTECTED]
http://web.bryant.edu/~bblais
Re: automatically grading small programming assignments
Brian Blais wrote:

> [EMAIL PROTECTED] wrote:
>> [EMAIL PROTECTED] wrote:
>>> Then on your PC you can run a script that loads each of such
>>> programs, and runs a good series of tests, to test their quality...
>>
>> What happens if someone-- perhaps not even someone in the class--
>> does some version of os.system('rm -Rf /')?
>
> I was thinking of including a dummy os.py and sys.py, so import os and
> import sys would fail. Would this work? Is there anything else
> obvious? I can have student authentication; that's not a problem.

Setting up a Crunchy server in a virtualized OS environment would give
you some security.

- Paddy.
Re: automatically grading small programming assignments
Brian Blais [EMAIL PROTECTED] writes:

> Unfortunately, it takes a lot of time to grade such things by hand, so
> I would like to automate it as much as possible. ...
> Or perhaps there is a better way to do this sort of thing. How do
> others who teach Python handle this?

I think you should not attempt this. It means grading the program on
pure functionality. Writing good code is about more than functionality;
it's also about communicating with humans.

Supplying a doctest is a reasonable idea, but students should just run
the test themselves until their code passes; there is no need to log all
their debugging attempts. When they submit something that passes the
test, you should read the code and supply some feedback about their
coding style and so forth. This matters as much as the pure issue of
whether the program works. Imagine an English teacher wanting to grade
class papers automatically by running them through a spelling and
grammar checker without reading them.

If the workload of grading manually is too high, see if you can bring on
an assistant to help.
Re: automatically grading small programming assignments
On Dec 14, 8:36 pm, Brian Blais [EMAIL PROTECTED] wrote:

> [EMAIL PROTECTED] wrote:
>> [EMAIL PROTECTED] wrote:
>>> Then on your PC you can run a script that loads each of such
>>> programs, and runs a good series of tests, to test their quality...
>>
>> What happens if someone-- perhaps not even someone in the class--
>> does some version of os.system('rm -Rf /')?
>
> I was thinking of including a dummy os.py and sys.py, so import os and
> import sys would fail. Would this work?

How would they access their command-line arguments without sys.argv?