Are Computer Science exercises a good fit for the exercism platform? #124

Closed
remcopeereboom opened this issue Mar 11, 2017 · 9 comments
@remcopeereboom

I am noticing that a lot of the exercises that ask users to implement a data structure or classical algorithm are not a very good fit on exercism. We've had some discussion about this on the xruby repository (e.g. #495 and #513 on xruby) and I was wondering if other people have noticed the same thing in other language tracks.

Some issues:

  1. No freedom to explore
    As it stands, many of the tests force a very specific implementation. This may be important didactically, but it runs counter to exploring the problem. I've also found that many of the implementations forced by the tests are, imho, just not very good. But clearly other people disagree, so I feel there should be more leeway.
  2. No discussion about use cases
    Many of the data structures are used for very particular things. There are reasons why in some cases you might want to use a linked list and in others you might want to use an array (to name but one example), but the problems do not make this clear.
  3. No discussion about performance
    A lot of the data structures and classic algorithms require knowledge of basic algorithmic analysis. There is a binary search exercise on xruby that requires people to check that the input is sorted. But such a check always takes O(n) time, which means that the algorithm overall is going to take O(n) time, while the whole point of doing binary search in the first place is to do it in O(lg n).
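
The binary search complaint can be sketched concretely. Below is a hypothetical Ruby solution (invented here, not taken from the track): the moment the tests require validating sortedness, a linear pass dominates the logarithmic search.

```ruby
# A sketch of the problem described above: requiring the implementation to
# validate its own input forces an O(n) pass before the O(lg n) search
# even starts, making the whole method O(n).
def binary_search(array, target)
  # This validation alone walks the entire array: O(n).
  raise ArgumentError, 'input must be sorted' unless array.each_cons(2).all? { |a, b| a <= b }

  low, high = 0, array.length - 1
  while low <= high
    mid = (low + high) / 2
    case array[mid] <=> target
    when 0  then return mid
    when -1 then low  = mid + 1
    else         high = mid - 1
    end
  end
  nil
end
```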

Is exercism really the best place to teach people these kinds of things? Clearly I am skeptical that it is. There are a lot of places online where people can learn about algorithms and data structures and I feel that they do a much better job of it. But I am also aware that I am often overly skeptical. I can see some possible solutions.

  1. Instead of asking users to implement algorithms like binary search, we should consider asking users to implement general search (possibly with the given constraint that the input will be sorted).
  2. Instead of testing for data structure implementation details, we could implement stress tests to check that the implementation indeed exhibits the desired asymptotic performance. If nothing else we could add large-input tests to make sure that "wrong" implementations also "feel wrong" to clients. Possible downside: I can see how not testing for implementation details makes it harder for people unfamiliar with the data structures to "get it".
@rbasso

rbasso commented Mar 11, 2017

If an exercise demands a specific algorithm/data structure, it goes against what I think is a more general rule:

We should never direct a solution. That takes away all the fun of the discovery.

Or, as already discussed in other terms, Exercises should not be over-prescriptive.

A useful algorithm/data structure can probably be used to solve at least one practical problem. This problem could be a good exercise, without making any reference to how to solve it.

The classical example is sorting, which is a practical problem. Another one is the Travelling Salesman Problem, which can be clearly stated, is language-neutral, and allows multiple heuristics.

Taking a more concrete example, I think it would be way more interesting to have a Find the shortest path between The Shire and Mordor exercise than an Implement Dijkstra's algorithm one.
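Such an exercise would only check the answer, leaving the choice of algorithm open. As a sketch of what a solution might look like, the hypothetical snippet below (graph, place names, and method name all invented here) happens to use Dijkstra's algorithm, but BFS, A*, or anything else producing the same answer would pass.

```ruby
# A hedged sketch of a "shortest path" solution using Dijkstra's algorithm.
# The map and its edge weights are invented purely for illustration.
def shortest_path(graph, start, goal)
  dist = { start => 0 }
  prev = {}
  unvisited = graph.keys
  until unvisited.empty?
    # Pick the unvisited node with the smallest known distance.
    current = unvisited.min_by { |node| dist.fetch(node, Float::INFINITY) }
    break if current == goal || !dist.key?(current)
    unvisited.delete(current)
    graph[current].each do |neighbor, cost|
      candidate = dist[current] + cost
      if candidate < dist.fetch(neighbor, Float::INFINITY)
        dist[neighbor] = candidate
        prev[neighbor] = current
      end
    end
  end
  # Walk the predecessor chain back from the goal to recover the path.
  path = [goal]
  path.unshift(prev[path.first]) while prev.key?(path.first)
  [path, dist[goal]]
end

middle_earth = {
  'Shire'  => { 'Bree' => 2, 'Moria' => 9 },
  'Bree'   => { 'Moria' => 5, 'Rohan' => 7 },
  'Moria'  => { 'Rohan' => 1 },
  'Rohan'  => { 'Mordor' => 3 },
  'Mordor' => {}
}
```

The tests would then only assert on the returned path and cost, never on how they were computed.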

@kytrinyx
Member

This is an excellent observation. I think that the solution is as rbasso stated it:

A useful algorithm/data structure can probably be used to solve at least one practical problem. This problem could be a good exercise, without making any reference to how to solve it.

Let's deprecate the CS/algorithm exercises in favor of a more generally stated problem that could use that algorithm.

@petertseng
Member

petertseng commented Mar 14, 2017

I always saw that it was hard for our tests to be sure that a student's solution implements the desired algorithm (e.g. in sieve or binary-search). After all, the tests typically check that the output for a given input is as desired, and there can be many ways to arrive at this output. And thus I saw many solutions that did not even try to implement a sieve.

Some languages may be able to instrument their arrays to keep access counters and thus check whether binary search performs the proper number of accesses, but that is not available in all languages.
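In Ruby, for instance, the instrumentation idea might look something like this (a sketch with invented names; no such helper exists in the tracks): wrap the array so every read is counted, then assert the search stays within an O(lg n) budget.

```ruby
# A minimal wrapper that counts element reads, so a test could verify a
# search touches at most ~lg(n) elements rather than scanning linearly.
class CountingArray
  attr_reader :reads

  def initialize(items)
    @items = items
    @reads = 0
  end

  def [](index)
    @reads += 1
    @items[index]
  end

  def length
    @items.length
  end
end

# An ordinary binary search, oblivious to the instrumentation.
def binary_search(array, target)
  low, high = 0, array.length - 1
  while low <= high
    mid = (low + high) / 2
    case array[mid] <=> target
    when 0  then return mid
    when -1 then low  = mid + 1
    else         high = mid - 1
    end
  end
  nil
end
```

A linear scan over 1024 elements would make hundreds of reads; a correct binary search makes at most eleven, so the counter cleanly separates the two.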

Those are the two most prominent exercises I know of that seem to ask for a specific algorithm to do a general task.

As for the data structure problems, all I can say is that I tend not to do them, but that's just me.

@NobbZ
Member

NobbZ commented Mar 14, 2017

I also think that it is hard to test for a given implementation in most languages, and therefore hard to "enforce" specific algorithms, so I'd say that asking for a specific algorithm should not be part of the goal of exercism.

But I have to admit that I also learned about new algorithms on exercism that were not covered by the algorithms or FP lessons at my university. The most prominent in my mind is the "zipper", which I wasn't aware of before doing the exercise in xelixir, but today I use it on a regular basis.
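For readers who haven't met it, a list zipper can be sketched in a few lines of Ruby (an illustrative sketch, not the xelixir exercise's API; every name here is invented): a focus element plus the elements to its left and right, so you can move and edit around the focus without re-traversing the list.

```ruby
# A minimal immutable list zipper. The left part is stored nearest-first
# (reversed), as is conventional, so the neighbor of the focus is left.first.
Zipper = Struct.new(:left, :focus, :right) do
  def self.from_list(list)
    new([], list.first, list.drop(1))
  end

  def move_right
    Zipper.new([focus] + left, right.first, right.drop(1))
  end

  def move_left
    Zipper.new(left.drop(1), left.first, [focus] + right)
  end

  def replace(value)
    Zipper.new(left, value, right)
  end

  def to_list
    left.reverse + [focus] + right
  end
end
```

(With Ruby arrays these moves still copy; in a persistent-linked-list language the same structure gives O(1) moves, which is the usual selling point.)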

So I think we should change the way we ask for this.

Instead of asking "Write a program that implements the Sieve of Eratosthenes." we should ask "Write a program which returns a list of prime numbers; there are a couple of efficient algorithms: the Sieve of Eratosthenes, the wheel sieve, etc.", perhaps even with links.
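For illustration, here is what one of those efficient algorithms looks like as a possible solution to the reframed exercise (a sketch, not a reference implementation; the method name is invented):

```ruby
# Sieve of Eratosthenes: mark the multiples of each prime as composite,
# then collect whatever is left unmarked.
def primes_up_to(limit)
  return [] if limit < 2
  composite = Array.new(limit + 1, false)
  (2..Integer.sqrt(limit)).each do |p|
    next if composite[p]
    # Cross off multiples of p, starting at p*p (smaller ones are done).
    (p * p).step(limit, p) { |multiple| composite[multiple] = true }
  end
  (2..limit).reject { |n| composite[n] }
end
```

The reframed tests would accept this, a wheel sieve, or even trial division equally, since they only check the returned list.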

@kotp
Member

kotp commented Mar 14, 2017

I like the "introduction to available algorithms" rather than the direct suggested requirement.

It gives options that may otherwise not be known... so more room for learning, comparing, discovery.

@ErikSchierboom
Member

I love @NobbZ's suggestion! It really allows for a lot of freedom on the part of the user.

@jtigger

jtigger commented Mar 16, 2017

This is an excellent observation, @remcopeereboom, thank you for thoroughly explaining it.

I really like @NobbZ's balance: a wider field, with guardrails for the less experienced. Yes, with links!

I suspect that the more we soften up some of the later exercises (while leaving a breadcrumb trail so that Early Programmers can continue to have footing), the more room there is for interesting dialogue.

@Insti

Insti commented Apr 16, 2017

I think we have a policy conclusion thanks to @remcopeereboom and @NobbZ:

Instead of asking "Write a program that implements the Sieve of Eratosthenes." we should ask "Write a program which returns a list of prime numbers; there are a couple of efficient algorithms: the Sieve of Eratosthenes, the wheel sieve, etc.", perhaps even with links.

@kytrinyx
Member

Agreed, thank you @Insti.

I've opened a new issue based on the conclusion.


9 participants