1. Run jobs via Perl programs against a web-service interface, rather than
   writing a script that speaks raw HTTP, fills in the HTML form, sends the
   submit request, and waits for the output.
2. Programmatic submission and result retrieval is more robust than
   scraping through HTML pages.
3. The server operators can programmatically limit the number of requests
   per unit of time (rate limiting).
4. What kind of interface does Hilmar have in mind? POST a file plus a
   string specifying the type of analysis (raxml, garli, rid3-raxml),
   then poll for results?
5. The client submits via POST; the server interface returns a status
   string in the response body:
      queued
      accepted
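The points above amount to a simple submit-and-poll protocol. Below is a minimal sketch of that flow (the notes envision Perl clients; Python is used here only for illustration), with a small mock server standing in for the real service. The endpoint paths (`/submit`, `/results`), parameter names, and the `queued`/`accepted` status strings are assumptions drawn from the notes, not a settled API.

```python
import threading
import urllib.parse
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class MockJobHandler(BaseHTTPRequestHandler):
    """Mock of the envisioned service: POST enqueues a job, GET polls it."""
    polls = 0  # class-level counter so state survives across requests

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)          # consume the submitted file/params
        self._reply(202, "queued")       # job accepted into the queue

    def do_GET(self):
        # First poll: still queued; subsequent polls: result is ready.
        type(self).polls += 1
        if type(self).polls < 2:
            self._reply(200, "queued")
        else:
            self._reply(200, "accepted\n(result payload would go here)")

    def _reply(self, code, body):
        data = body.encode()
        self.send_response(code)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):        # silence per-request logging
        pass

def submit_and_poll(base_url, file_data, analysis):
    """Submit a job, then poll until the body no longer says 'queued'."""
    payload = urllib.parse.urlencode(
        {"file": file_data, "analysis": analysis}).encode()
    with urllib.request.urlopen(base_url + "/submit", payload) as resp:
        status = resp.read().decode()
    while status.startswith("queued"):
        with urllib.request.urlopen(base_url + "/results") as resp:
            status = resp.read().decode()
    return status

server = HTTPServer(("127.0.0.1", 0), MockJobHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d" % server.server_address[1]
result = submit_and_poll(url, ">seq1\nACGT", "raxml")
server.shutdown()
print(result.splitlines()[0])  # -> accepted
```

A real client would sleep between polls and pass a job identifier returned at submission time; both are omitted here to keep the protocol shape visible.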